So far so good

Commit 7c7054d2e2 (parent b9650739bf) — 2025-12-16 18:10:40 +02:00
44 changed files with 27029 additions and 48 deletions

CLAUDE.md (new file, 151 lines)

@@ -0,0 +1,151 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Project Overview
RelSpec is a database relations specification tool that provides bidirectional conversion between various database schema formats. It reads database schemas from multiple sources (live databases, DBML, DCTX, DrawDB, etc.) and writes them to various formats (GORM, Bun, JSON, YAML, SQL, etc.).
## Build Commands
```bash
# Build the binary
make build # Outputs to build/relspec
go build -o build/relspec ./cmd/relspec
# Run tests
make test # Run all tests with race detection and coverage
go test ./... # Run tests without coverage
# Run a single test
go test -run TestName ./pkg/readers/dbml
# Linting
make lint # Requires golangci-lint installed
# Coverage report
make coverage # Generates coverage.html
# Clean build artifacts
make clean
# Install binary
make install # Installs to $GOPATH/bin
```
## Architecture
### Core Data Model (pkg/models/)
The central data structure is the `Database` model, which represents a complete database schema with this hierarchy:
```
Database
└── Schemas ([]Schema)
└── Tables ([]Table)
├── Columns (map[string]Column)
├── Constraints (map[string]Constraint)
├── Indexes (map[string]Index)
└── Relationships (map[string]Relationship)
```
**Key architectural decisions:**
- Tables use **maps** for Columns, Constraints, Indexes, and Relationships (keyed by name for O(1) lookup)
- Schemas use **slices** for Tables (order matters for generation)
- All model types implement `SQLNamer` interface (returns lowercase SQL-safe names)
- Use `Init*` functions (e.g., `InitTable()`, `InitSchema()`) to create properly initialized models with empty maps/slices
**Model Views:**
- `flatview.go`: Provides denormalized views with fully qualified names (e.g., `database.schema.table.column`)
- `summaryview.go`: Lightweight summary views with counts and essential metadata
- Use the `.ToFlatColumns()` and `.ToSummary()` methods to convert between views
### Reader/Writer Pattern (pkg/readers/, pkg/writers/)
All readers and writers implement consistent interfaces with three granularity levels:
```go
// Reader interface
type Reader interface {
ReadDatabase() (*models.Database, error)
ReadSchema() (*models.Schema, error)
ReadTable() (*models.Table, error)
}
// Writer interface
type Writer interface {
WriteDatabase(db *models.Database) error
WriteSchema(schema *models.Schema) error
WriteTable(table *models.Table) error
}
```
**Important patterns:**
- Each format (dbml, dctx, drawdb, etc.) has its own `pkg/readers/<format>/` and `pkg/writers/<format>/` subdirectories
- Use `ReaderOptions` and `WriterOptions` structs for configuration (file paths, connection strings, metadata)
- `ReadSchema()` typically returns the first schema of the database
- `ReadTable()` typically returns the first table of the schema
### Transformation Layer (pkg/transform/)
The `Transformer` provides validation and normalization utilities. Note: the validation methods are currently stubs that return nil; they must be implemented before callers can rely on them.
### Database-Specific Utilities (pkg/pgsql/)
Contains PostgreSQL-specific helpers:
- `keywords.go`: SQL reserved keywords validation
- `datatypes.go`: PostgreSQL data type mappings and conversions
## Development Patterns
### Adding a New Reader
1. Create `pkg/readers/<format>/` directory
2. Implement the `Reader` interface with all three methods
3. Create a `NewReader(options *readers.ReaderOptions)` constructor
4. Parse format-specific data into the canonical `models.Database` structure
5. Use `models.Init*()` functions to create properly initialized structs
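The steps above can be sketched as a minimal reader. This is a self-contained illustration, not the repository's actual code: the `Database`, `Schema`, `Table`, and `ReaderOptions` types below are local stand-ins for the real ones in `pkg/models` and `pkg/readers`, and the `FilePath` field name is an assumption.

```go
package main

import "fmt"

// Local stand-ins for pkg/models types (field names are assumptions).
type Table struct{ Name string }

type Schema struct {
	Name   string
	Tables []*Table
}

type Database struct {
	Name    string
	Schemas []*Schema
}

// Local stand-in for readers.ReaderOptions.
type ReaderOptions struct {
	FilePath string
}

// Reader parses a hypothetical format into the canonical Database model.
type Reader struct {
	options *ReaderOptions
}

func NewReader(options *ReaderOptions) *Reader {
	return &Reader{options: options}
}

func (r *Reader) ReadDatabase() (*Database, error) {
	// A real implementation would parse r.options.FilePath here.
	db := &Database{Name: "mydb"}
	db.Schemas = append(db.Schemas, &Schema{Name: "public"})
	return db, nil
}

// ReadSchema returns the first schema, matching the convention noted above.
func (r *Reader) ReadSchema() (*Schema, error) {
	db, err := r.ReadDatabase()
	if err != nil {
		return nil, err
	}
	if len(db.Schemas) == 0 {
		return nil, fmt.Errorf("no schemas found in %s", r.options.FilePath)
	}
	return db.Schemas[0], nil
}

func main() {
	schema, err := NewReader(&ReaderOptions{FilePath: "example.dbml"}).ReadSchema()
	if err != nil {
		panic(err)
	}
	fmt.Println(schema.Name) // prints "public"
}
```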
### Adding a New Writer
1. Create `pkg/writers/<format>/` directory
2. Implement the `Writer` interface with all three methods
3. Create a `NewWriter(options *writers.WriterOptions)` constructor
4. Transform canonical models into format-specific output
5. Handle file writing or other I/O in the writer implementation
### Working with Models
```go
// Creating models - ALWAYS use Init functions
db := models.InitDatabase("mydb")
schema := models.InitSchema("public")
table := models.InitTable("users", "public")
column := models.InitColumn("id", "users", "public")
// Adding to parent structures
schema.Tables = append(schema.Tables, table)
table.Columns["id"] = column // Use map key access for columns
db.Schemas = append(db.Schemas, schema)
// Accessing primary keys and foreign keys
pk := table.GetPrimaryKey() // Returns *Column or nil
fks := table.GetForeignKeys() // Returns []*Constraint
```
## CLI Implementation Status
The CLI in `cmd/relspec/main.go` is currently a placeholder showing usage examples. It will be implemented using the Cobra framework (already in dependencies).
## Testing
- Test files should be in the same package as the code they test
- Use table-driven tests for multiple test cases
- All tests run with race detection via `make test`
- Coverage reports available via `make coverage`
## Module Information
- Module path: `git.warky.dev/wdevs/relspecgo`
- Go version: 1.25.5
- Uses Cobra for CLI, Viper for configuration


@@ -12,6 +12,8 @@ RelSpec provides bidirectional conversion between various database specification
- Transform legacy schema definitions (Clarion DCTX, XML, JSON)
- Generate standardized specification files (JSON, YAML)
![1.00](./assets/image/relspec1.jpg)
## Features
### Input Formats


@@ -1,11 +1,6 @@
# RelSpec - TODO List
## Project Setup
- [ ] Initialize Go module (`go.mod`)
- [ ] Set up project directory structure
- [ ] Create `.editorconfig` for consistent formatting
- [ ] Add GitHub Actions for CI/CD
- [ ] Set up pre-commit hooks
## Core Infrastructure
- [ ] Define internal data model for database relations

BIN assets/image/relspec1.jpg (new binary file, not shown; 171 KiB)

BIN assets/image/relspec2.jpg (new binary file, not shown; 80 KiB)

BIN assets/image/relspec3.jpg (new binary file, not shown; 192 KiB)


@@ -0,0 +1,88 @@
package models_bun
import "fmt"
import db "github.com/bitechdev/GoCore/pkg/models"
import "github.com/bitechdev/GoCore/pkg/types"
import "github.com/uptrace/bun"
import resolvespec_common "github.com/bitechdev/ResolveSpec/pkg/common"
//ModelCoreMasterprocess - Generated Table for Schema core
type ModelCoreMasterprocess struct {
bun.BaseModel `bun:"table:core.masterprocess,alias:masterprocess"`
Description resolvespec_common.SqlString `json:"description" bun:"description,type:citext,"`
GUID resolvespec_common.SqlUUID `json:"guid" bun:"guid,type:uuid,default:newid(),"`
Inactive resolvespec_common.SqlInt16 `json:"inactive" bun:"inactive,type:smallint,"`
Jsonvalue resolvespec_common.SqlJSONB `json:"jsonvalue" bun:"jsonvalue,type:jsonb,"`
Ridjsonschema resolvespec_common.SqlInt32 `json:"rid_jsonschema" bun:"rid_jsonschema,type:integer,"`
Ridmasterprocess resolvespec_common.SqlInt32 `json:"rid_masterprocess" bun:"rid_masterprocess,type:integer,pk,default:nextval('core.identity_masterprocess_rid_masterprocess'::regclass),"`
Ridmastertypehubtype resolvespec_common.SqlInt32 `json:"rid_mastertype_hubtype" bun:"rid_mastertype_hubtype,type:integer,"`
Ridmastertypeprocesstype resolvespec_common.SqlInt32 `json:"rid_mastertype_processtype" bun:"rid_mastertype_processtype,type:integer,"`
Ridprogrammodule resolvespec_common.SqlInt32 `json:"rid_programmodule" bun:"rid_programmodule,type:integer,"`
Sequenceno resolvespec_common.SqlInt32 `json:"sequenceno" bun:"sequenceno,type:integer,"`
Singleprocess resolvespec_common.SqlInt16 `json:"singleprocess" bun:"singleprocess,type:smallint,"`
Updatecnt int64 `json:"updatecnt" bun:"updatecnt,type:integer,default:0,"`
JSON *ModelCoreJsonschema `json:"JSON,omitempty" bun:"rel:has-one,join:rid_jsonschema=rid_jsonschema"`
MTT_RID_MASTERTYPE_HUBTYPE *ModelCoreMastertype `json:"MTT_RID_MASTERTYPE_HUBTYPE,omitempty" bun:"rel:has-one,join:rid_mastertype_hubtype=rid_mastertype"`
MTT_RID_MASTERTYPE_PROCESSTYPE *ModelCoreMastertype `json:"MTT_RID_MASTERTYPE_PROCESSTYPE,omitempty" bun:"rel:has-one,join:rid_mastertype_processtype=rid_mastertype"`
PMO *ModelPublicProgrammodule `json:"PMO,omitempty" bun:"rel:has-one,join:rid_programmodule=rid_programmodule"`
MTL []*ModelCoreMastertask `json:"MTL,omitempty" bun:"rel:has-many,join:rid_masterprocess=rid_masterprocess"`
PRO []*ModelCoreProcess `json:"PRO,omitempty" bun:"rel:has-many,join:rid_masterprocess=rid_masterprocess"`
db.DBAdhocBuffer `json:",omitempty" bun:",scanonly"`
db.DBGetIDInterface `json:",omitempty" bun:"-"`
types.SQLTypable `json:",omitempty" bun:"-"`
}
// TableName - Returns the table name for the object.
func (m ModelCoreMasterprocess) TableName() string {
return "core.masterprocess"
}
// TableName - Returns the table name for the object.
func (m ModelCoreMasterprocess) TableNameOnly() string {
return "masterprocess"
}
// SchemaName - Returns the schema name for the object.
func (m ModelCoreMasterprocess) SchemaName() string {
return "core"
}
// GetID - ID interface
func (m ModelCoreMasterprocess) GetID() int64 {
return m.Ridmasterprocess.Int64()
}
// GetIDStr - ID interface
func (m ModelCoreMasterprocess) GetIDStr() string {
return fmt.Sprintf("%d", m.Ridmasterprocess)
}
// SetID - ID interface
func (m ModelCoreMasterprocess) SetID(newid int64) {
m.UpdateID(newid)
}
func (m *ModelCoreMasterprocess) UpdateID(newid int64) {
m.Ridmasterprocess.FromString(fmt.Sprintf("%d", newid))
}
// GetIDName - ID interface
func (m ModelCoreMasterprocess) GetIDName() string {
return "rid_masterprocess"
}
// GetPrefix - Returns a table prefix
func (m ModelCoreMasterprocess) GetPrefix() string {
return "MPR"
}
// GetRowNumber - Returns the row number of the record
func (m ModelCoreMasterprocess) GetRowNumber() int64 {
return m.RowNumber
}
// SetRowNumber - Set the row number of a record
func (m *ModelCoreMasterprocess) SetRowNumber(num int64) {
m.RowNumber = num
}


@@ -0,0 +1,96 @@
package models_bun
import "fmt"
import db "github.com/bitechdev/GoCore/pkg/models"
import "github.com/bitechdev/GoCore/pkg/types"
import "github.com/uptrace/bun"
import resolvespec_common "github.com/bitechdev/ResolveSpec/pkg/common"
//ModelCoreMastertask - Generated Table for Schema core
type ModelCoreMastertask struct {
bun.BaseModel `bun:"table:core.mastertask,alias:mastertask"`
Allactionsmustcomplete resolvespec_common.SqlInt16 `json:"allactionsmustcomplete" bun:"allactionsmustcomplete,type:smallint,"`
Condition resolvespec_common.SqlString `json:"condition" bun:"condition,type:citext,"`
Description resolvespec_common.SqlString `json:"description" bun:"description,type:citext,"`
Dueday resolvespec_common.SqlInt16 `json:"dueday" bun:"dueday,type:smallint,"`
Dueoption resolvespec_common.SqlString `json:"dueoption" bun:"dueoption,type:citext,"`
Escalation resolvespec_common.SqlInt32 `json:"escalation" bun:"escalation,type:integer,"`
Escalationoption resolvespec_common.SqlString `json:"escalationoption" bun:"escalationoption,type:citext,"`
GUID resolvespec_common.SqlUUID `json:"guid" bun:"guid,type:uuid,default:newid(),"`
Inactive resolvespec_common.SqlInt16 `json:"inactive" bun:"inactive,type:smallint,"`
Jsonvalue resolvespec_common.SqlJSONB `json:"jsonvalue" bun:"jsonvalue,type:jsonb,"`
Mastertasknote resolvespec_common.SqlString `json:"mastertasknote" bun:"mastertasknote,type:citext,"`
Repeatinterval resolvespec_common.SqlInt16 `json:"repeatinterval" bun:"repeatinterval,type:smallint,"`
Repeattype resolvespec_common.SqlString `json:"repeattype" bun:"repeattype,type:citext,"`
Ridjsonschema resolvespec_common.SqlInt32 `json:"rid_jsonschema" bun:"rid_jsonschema,type:integer,"`
Ridmasterprocess resolvespec_common.SqlInt32 `json:"rid_masterprocess" bun:"rid_masterprocess,type:integer,"`
Ridmastertask resolvespec_common.SqlInt32 `json:"rid_mastertask" bun:"rid_mastertask,type:integer,pk,default:nextval('core.identity_mastertask_rid_mastertask'::regclass),"`
Ridmastertypetasktype resolvespec_common.SqlInt32 `json:"rid_mastertype_tasktype" bun:"rid_mastertype_tasktype,type:integer,"`
Sequenceno resolvespec_common.SqlInt32 `json:"sequenceno" bun:"sequenceno,type:integer,"`
Singletask resolvespec_common.SqlInt16 `json:"singletask" bun:"singletask,type:smallint,"`
Startday resolvespec_common.SqlInt16 `json:"startday" bun:"startday,type:smallint,"`
Updatecnt int64 `json:"updatecnt" bun:"updatecnt,type:integer,default:0,"`
JSON *ModelCoreJsonschema `json:"JSON,omitempty" bun:"rel:has-one,join:rid_jsonschema=rid_jsonschema"`
MPR *ModelCoreMasterprocess `json:"MPR,omitempty" bun:"rel:has-one,join:rid_masterprocess=rid_masterprocess"`
MTT *ModelCoreMastertype `json:"MTT,omitempty" bun:"rel:has-one,join:rid_mastertype_tasktype=rid_mastertype"`
MAL []*ModelCoreMastertaskitem `json:"MAL,omitempty" bun:"rel:has-many,join:rid_mastertask=rid_mastertask"`
TAS []*ModelCoreTasklist `json:"TAS,omitempty" bun:"rel:has-many,join:rid_mastertask=rid_mastertask"`
db.DBAdhocBuffer `json:",omitempty" bun:",scanonly"`
db.DBGetIDInterface `json:",omitempty" bun:"-"`
types.SQLTypable `json:",omitempty" bun:"-"`
}
// TableName - Returns the table name for the object.
func (m ModelCoreMastertask) TableName() string {
return "core.mastertask"
}
// TableName - Returns the table name for the object.
func (m ModelCoreMastertask) TableNameOnly() string {
return "mastertask"
}
// SchemaName - Returns the schema name for the object.
func (m ModelCoreMastertask) SchemaName() string {
return "core"
}
// GetID - ID interface
func (m ModelCoreMastertask) GetID() int64 {
return m.Ridmastertask.Int64()
}
// GetIDStr - ID interface
func (m ModelCoreMastertask) GetIDStr() string {
return fmt.Sprintf("%d", m.Ridmastertask)
}
// SetID - ID interface
func (m ModelCoreMastertask) SetID(newid int64) {
m.UpdateID(newid)
}
func (m *ModelCoreMastertask) UpdateID(newid int64) {
m.Ridmastertask.FromString(fmt.Sprintf("%d", newid))
}
// GetIDName - ID interface
func (m ModelCoreMastertask) GetIDName() string {
return "rid_mastertask"
}
// GetPrefix - Returns a table prefix
func (m ModelCoreMastertask) GetPrefix() string {
return "MTL"
}
// GetRowNumber - Returns the row number of the record
func (m ModelCoreMastertask) GetRowNumber() int64 {
return m.RowNumber
}
// SetRowNumber - Set the row number of a record
func (m *ModelCoreMastertask) SetRowNumber(num int64) {
m.RowNumber = num
}


@@ -0,0 +1,101 @@
package models_bun
import "fmt"
import db "github.com/bitechdev/GoCore/pkg/models"
import "github.com/bitechdev/GoCore/pkg/types"
import "github.com/uptrace/bun"
import resolvespec_common "github.com/bitechdev/ResolveSpec/pkg/common"
//ModelCoreMastertype - Generated Table for Schema core
type ModelCoreMastertype struct {
bun.BaseModel `bun:"table:core.mastertype,alias:mastertype"`
Category resolvespec_common.SqlString `json:"category" bun:"category,type:citext,"`
Description resolvespec_common.SqlString `json:"description" bun:"description,type:citext,"`
Disableedit resolvespec_common.SqlInt16 `json:"disableedit" bun:"disableedit,type:smallint,"`
Forprefix resolvespec_common.SqlString `json:"forprefix" bun:"forprefix,type:citext,"`
GUID resolvespec_common.SqlUUID `json:"guid" bun:"guid,type:uuid,default:newid(),"`
Hidden resolvespec_common.SqlInt16 `json:"hidden" bun:"hidden,type:smallint,"`
Inactive resolvespec_common.SqlInt16 `json:"inactive" bun:"inactive,type:smallint,"`
Jsonvalue resolvespec_common.SqlJSONB `json:"jsonvalue" bun:"jsonvalue,type:jsonb,"`
Mastertype resolvespec_common.SqlString `json:"mastertype" bun:"mastertype,type:citext,"`
Note resolvespec_common.SqlString `json:"note" bun:"note,type:citext,"`
Ridmastertype resolvespec_common.SqlInt32 `json:"rid_mastertype" bun:"rid_mastertype,type:integer,pk,default:nextval('core.identity_mastertype_rid_mastertype'::regclass),"`
Ridparent resolvespec_common.SqlInt32 `json:"rid_parent" bun:"rid_parent,type:integer,"`
Updatecnt int64 `json:"updatecnt" bun:"updatecnt,type:integer,default:0,"`
MTT *ModelCoreMastertype `json:"MTT,omitempty" bun:"rel:has-one,join:rid_mastertype=rid_parent"`
CMAT []*ModelCoreCommitem_Attachment `json:"CMAT,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype"`
DVT []*ModelCoreDocumentvault `json:"DVT,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype"`
EAD []*ModelCoreEmailaddresslist `json:"EAD,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype"`
JSON []*ModelCoreJsonschema `json:"JSON,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype"`
MAL []*ModelCoreMastertaskitem `json:"MAL,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype_hubtype"`
MPR_RID_MASTERTYPE_HUBTYPE []*ModelCoreMasterprocess `json:"MPR_RID_MASTERTYPE_HUBTYPE,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype_hubtype"`
MPR_RID_MASTERTYPE_PROCESSTYPE []*ModelCoreMasterprocess `json:"MPR_RID_MASTERTYPE_PROCESSTYPE,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype_processtype"`
MSE []*ModelCoreMasterservice `json:"MSE,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype_hubtype"`
MTL []*ModelCoreMastertask `json:"MTL,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype_tasktype"`
MTT_RID_PARENT []*ModelCoreMastertype `json:"MTT_RID_PARENT,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_parent"`
RUL []*ModelCoreMasterworkflowrule `json:"RUL,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype_group"`
TAT_RID_MASTERTYPE_DOCGENTYPE []*ModelCoreMasterdoctemplate `json:"TAT_RID_MASTERTYPE_DOCGENTYPE,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype_docgentype"`
TAT_RID_MASTERTYPE_DOCUMENT []*ModelCoreMasterdoctemplate `json:"TAT_RID_MASTERTYPE_DOCUMENT,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype_document"`
TAT_RID_MASTERTYPE_GROUP []*ModelCoreMasterdoctemplate `json:"TAT_RID_MASTERTYPE_GROUP,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype_group"`
TAT_RID_MASTERTYPE_HUBTYPE []*ModelCoreMasterdoctemplate `json:"TAT_RID_MASTERTYPE_HUBTYPE,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype_hubtype"`
TAT_RID_MASTERTYPE_MERGETYPE []*ModelCoreMasterdoctemplate `json:"TAT_RID_MASTERTYPE_MERGETYPE,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype_mergetype"`
TAT_RID_MASTERTYPE_TARGETTYPE []*ModelCoreMasterdoctemplate `json:"TAT_RID_MASTERTYPE_TARGETTYPE,omitempty" bun:"rel:has-many,join:rid_mastertype=rid_mastertype_targettype"`
db.DBAdhocBuffer `json:",omitempty" bun:",scanonly"`
db.DBGetIDInterface `json:",omitempty" bun:"-"`
types.SQLTypable `json:",omitempty" bun:"-"`
}
// TableName - Returns the table name for the object.
func (m ModelCoreMastertype) TableName() string {
return "core.mastertype"
}
// TableName - Returns the table name for the object.
func (m ModelCoreMastertype) TableNameOnly() string {
return "mastertype"
}
// SchemaName - Returns the schema name for the object.
func (m ModelCoreMastertype) SchemaName() string {
return "core"
}
// GetID - ID interface
func (m ModelCoreMastertype) GetID() int64 {
return m.Ridmastertype.Int64()
}
// GetIDStr - ID interface
func (m ModelCoreMastertype) GetIDStr() string {
return fmt.Sprintf("%d", m.Ridmastertype)
}
// SetID - ID interface
func (m ModelCoreMastertype) SetID(newid int64) {
m.UpdateID(newid)
}
func (m *ModelCoreMastertype) UpdateID(newid int64) {
m.Ridmastertype.FromString(fmt.Sprintf("%d", newid))
}
// GetIDName - ID interface
func (m ModelCoreMastertype) GetIDName() string {
return "rid_mastertype"
}
// GetPrefix - Returns a table prefix
func (m ModelCoreMastertype) GetPrefix() string {
return "MTT"
}
// GetRowNumber - Returns the row number of the record
func (m ModelCoreMastertype) GetRowNumber() int64 {
return m.RowNumber
}
// SetRowNumber - Set the row number of a record
func (m *ModelCoreMastertype) SetRowNumber(num int64) {
m.RowNumber = num
}


@@ -0,0 +1,83 @@
package models_bun
import "fmt"
import db "github.com/bitechdev/GoCore/pkg/models"
import "github.com/bitechdev/GoCore/pkg/types"
import "github.com/uptrace/bun"
import resolvespec_common "github.com/bitechdev/ResolveSpec/pkg/common"
//ModelCoreProcess - Generated Table for Schema core
type ModelCoreProcess struct {
bun.BaseModel `bun:"table:core.process,alias:process"`
Completedate resolvespec_common.SqlDate `json:"completedate" bun:"completedate,type:date,"`
Completetime types.CustomIntTime `json:"completetime" bun:"completetime,type:integer,"`
Description resolvespec_common.SqlString `json:"description" bun:"description,type:citext,"`
GUID resolvespec_common.SqlUUID `json:"guid" bun:"guid,type:uuid,default:newid(),"`
Ridcompleteuser resolvespec_common.SqlInt32 `json:"rid_completeuser" bun:"rid_completeuser,type:integer,"`
Ridhub resolvespec_common.SqlInt32 `json:"rid_hub" bun:"rid_hub,type:integer,"`
Ridmasterprocess resolvespec_common.SqlInt32 `json:"rid_masterprocess" bun:"rid_masterprocess,type:integer,"`
Ridprocess resolvespec_common.SqlInt32 `json:"rid_process" bun:"rid_process,type:integer,pk,default:nextval('core.identity_process_rid_process'::regclass),"`
Status resolvespec_common.SqlString `json:"status" bun:"status,type:citext,"`
Updatecnt int64 `json:"updatecnt" bun:"updatecnt,type:integer,default:0,"`
HUB *ModelCoreHub `json:"HUB,omitempty" bun:"rel:has-one,join:rid_hub=rid_hub"`
MPR *ModelCoreMasterprocess `json:"MPR,omitempty" bun:"rel:has-one,join:rid_masterprocess=rid_masterprocess"`
TAS []*ModelCoreTasklist `json:"TAS,omitempty" bun:"rel:has-many,join:rid_process=rid_process"`
db.DBAdhocBuffer `json:",omitempty" bun:",scanonly"`
db.DBGetIDInterface `json:",omitempty" bun:"-"`
types.SQLTypable `json:",omitempty" bun:"-"`
}
// TableName - Returns the table name for the object.
func (m ModelCoreProcess) TableName() string {
return "core.process"
}
// TableName - Returns the table name for the object.
func (m ModelCoreProcess) TableNameOnly() string {
return "process"
}
// SchemaName - Returns the schema name for the object.
func (m ModelCoreProcess) SchemaName() string {
return "core"
}
// GetID - ID interface
func (m ModelCoreProcess) GetID() int64 {
return m.Ridprocess.Int64()
}
// GetIDStr - ID interface
func (m ModelCoreProcess) GetIDStr() string {
return fmt.Sprintf("%d", m.Ridprocess)
}
// SetID - ID interface
func (m ModelCoreProcess) SetID(newid int64) {
m.UpdateID(newid)
}
func (m *ModelCoreProcess) UpdateID(newid int64) {
m.Ridprocess.FromString(fmt.Sprintf("%d", newid))
}
// GetIDName - ID interface
func (m ModelCoreProcess) GetIDName() string {
return "rid_process"
}
// GetPrefix - Returns a table prefix
func (m ModelCoreProcess) GetPrefix() string {
return "PRO"
}
// GetRowNumber - Returns the row number of the record
func (m ModelCoreProcess) GetRowNumber() int64 {
return m.RowNumber
}
// SetRowNumber - Set the row number of a record
func (m *ModelCoreProcess) SetRowNumber(num int64) {
m.RowNumber = num
}

examples/dctx/example.dctx (new executable file, 20994 lines; diff suppressed because it is too large)


@@ -0,0 +1,88 @@
package models
import (
"fmt"
db "github.com/bitechdev/GoCore/pkg/models"
"github.com/bitechdev/GoCore/pkg/types"
)
// ModelCoreMasterprocess - Generated Table for Schema core
type ModelCoreMasterprocess struct {
Description string `json:"description" gorm:"Column:description;type:citext;"`
GUID types.NullableUUID `json:"guid" gorm:"Column:guid;type:uuid;default:newid();"`
Inactive types.SInt16 `json:"inactive" gorm:"Column:inactive;type:smallint;"`
Jsonvalue types.NullableJSONB `json:"jsonvalue" gorm:"Column:jsonvalue;type:jsonb;"`
Ridjsonschema types.ZNullInt32 `json:"rid_jsonschema" gorm:"Column:rid_jsonschema;type:integer;"`
Ridmasterprocess int32 `json:"rid_masterprocess" gorm:"Column:rid_masterprocess;type:integer;primaryKey;default:nextval('core.identity_masterprocess_rid_masterprocess'::regclass);"`
Ridmastertypehubtype types.ZNullInt32 `json:"rid_mastertype_hubtype" gorm:"Column:rid_mastertype_hubtype;type:integer;"`
Ridmastertypeprocesstype types.ZNullInt32 `json:"rid_mastertype_processtype" gorm:"Column:rid_mastertype_processtype;type:integer;"`
Ridprogrammodule types.ZNullInt32 `json:"rid_programmodule" gorm:"Column:rid_programmodule;type:integer;"`
Sequenceno types.ZNullInt32 `json:"sequenceno" gorm:"Column:sequenceno;type:integer;"`
Singleprocess types.SInt16 `json:"singleprocess" gorm:"Column:singleprocess;type:smallint;"`
Updatecnt int64 `json:"updatecnt" gorm:"Column:updatecnt;type:integer;default:0;"`
//JSON *ModelCoreJsonschema `json:"JSON,omitempty" gorm:"references:rid_jsonschema;foreignKey:rid_jsonschema;"`
MTT_RID_MASTERTYPE_HUBTYPE *ModelCoreMastertype `json:"MTT_RID_MASTERTYPE_HUBTYPE,omitempty" gorm:"references:rid_mastertype_hubtype;foreignKey:rid_mastertype;"`
MTT_RID_MASTERTYPE_PROCESSTYPE *ModelCoreMastertype `json:"MTT_RID_MASTERTYPE_PROCESSTYPE,omitempty" gorm:"references:rid_mastertype_processtype;foreignKey:rid_mastertype;"`
//PMO *ModelPublicProgrammodule `json:"PMO,omitempty" gorm:"references:rid_programmodule;foreignKey:rid_programmodule;"`
MTL []*ModelCoreMastertask `json:"MTL,omitempty" gorm:"references:rid_masterprocess;foreignKey:rid_masterprocess;opt_c"`
PRO []*ModelCoreProcess `json:"PRO,omitempty" gorm:"references:rid_masterprocess;foreignKey:rid_masterprocess;opt_c"`
db.DBAdhocBuffer `json:",omitempty"`
db.DBGetIDInterface `json:",omitempty" gorm:"-"`
types.SQLTypable `json:",omitempty" gorm:"-"`
}
// TableName - Returns the table name for the object.
func (m ModelCoreMasterprocess) TableName() string {
return "core.masterprocess"
}
// TableName - Returns the table name for the object.
func (m ModelCoreMasterprocess) TableNameOnly() string {
return "masterprocess"
}
// SchemaName - Returns the schema name for the object.
func (m ModelCoreMasterprocess) SchemaName() string {
return "core"
}
// GetID - ID interface
func (m ModelCoreMasterprocess) GetID() int64 {
return int64(m.Ridmasterprocess)
}
// GetIDStr - ID interface
func (m ModelCoreMasterprocess) GetIDStr() string {
return fmt.Sprintf("%d", m.Ridmasterprocess)
}
// SetID - ID interface
func (m ModelCoreMasterprocess) SetID(newid int64) {
m.UpdateID(newid)
}
func (m *ModelCoreMasterprocess) UpdateID(newid int64) {
m.Ridmasterprocess = int32(newid)
}
// GetIDName - ID interface
func (m ModelCoreMasterprocess) GetIDName() string {
return "rid_masterprocess"
}
// GetPrefix - Returns a table prefix
func (m ModelCoreMasterprocess) GetPrefix() string {
return "MPR"
}
// GetRowNumber - Returns the row number of the record
func (m ModelCoreMasterprocess) GetRowNumber() int64 {
return m.RowNumber
}
// SetRowNumber - Set the row number of a record
func (m *ModelCoreMasterprocess) SetRowNumber(num int64) {
m.RowNumber = num
}


@@ -0,0 +1,93 @@
package models
import "fmt"
import db "github.com/bitechdev/GoCore/pkg/models"
import "github.com/bitechdev/GoCore/pkg/types"
//ModelCoreMastertask - Generated Table for Schema core
type ModelCoreMastertask struct {
Allactionsmustcomplete types.SInt16 `json:"allactionsmustcomplete" gorm:"Column:allactionsmustcomplete;type:smallint;"`
Condition string `json:"condition" gorm:"Column:condition;type:citext;"`
Description string `json:"description" gorm:"Column:description;type:citext;"`
Dueday types.SInt16 `json:"dueday" gorm:"Column:dueday;type:smallint;"`
Dueoption string `json:"dueoption" gorm:"Column:dueoption;type:citext;"`
Escalation types.ZNullInt32 `json:"escalation" gorm:"Column:escalation;type:integer;"`
Escalationoption string `json:"escalationoption" gorm:"Column:escalationoption;type:citext;"`
GUID types.NullableUUID `json:"guid" gorm:"Column:guid;type:uuid;default:newid();"`
Inactive types.SInt16 `json:"inactive" gorm:"Column:inactive;type:smallint;"`
Jsonvalue types.NullableJSONB `json:"jsonvalue" gorm:"Column:jsonvalue;type:jsonb;"`
Mastertasknote string `json:"mastertasknote" gorm:"Column:mastertasknote;type:citext;"`
Repeatinterval types.SInt16 `json:"repeatinterval" gorm:"Column:repeatinterval;type:smallint;"`
Repeattype string `json:"repeattype" gorm:"Column:repeattype;type:citext;"`
Ridjsonschema types.ZNullInt32 `json:"rid_jsonschema" gorm:"Column:rid_jsonschema;type:integer;"`
Ridmasterprocess types.ZNullInt32 `json:"rid_masterprocess" gorm:"Column:rid_masterprocess;type:integer;"`
Ridmastertask int32 `json:"rid_mastertask" gorm:"Column:rid_mastertask;type:integer;primaryKey;default:nextval('core.identity_mastertask_rid_mastertask'::regclass);"`
Ridmastertypetasktype types.ZNullInt32 `json:"rid_mastertype_tasktype" gorm:"Column:rid_mastertype_tasktype;type:integer;"`
Sequenceno types.ZNullInt32 `json:"sequenceno" gorm:"Column:sequenceno;type:integer;"`
Singletask types.SInt16 `json:"singletask" gorm:"Column:singletask;type:smallint;"`
Startday types.SInt16 `json:"startday" gorm:"Column:startday;type:smallint;"`
Updatecnt int64 `json:"updatecnt" gorm:"Column:updatecnt;type:integer;default:0;"`
JSON *ModelCoreJsonschema `json:"JSON,omitempty" gorm:"references:rid_jsonschema;foreignKey:rid_jsonschema;"`
MPR *ModelCoreMasterprocess `json:"MPR,omitempty" gorm:"references:rid_masterprocess;foreignKey:rid_masterprocess;"`
MTT *ModelCoreMastertype `json:"MTT,omitempty" gorm:"references:rid_mastertype_tasktype;foreignKey:rid_mastertype;"`
MAL []*ModelCoreMastertaskitem `json:"MAL,omitempty" gorm:"references:rid_mastertask;foreignKey:rid_mastertask;opt_c"`
TAS []*ModelCoreTasklist `json:"TAS,omitempty" gorm:"references:rid_mastertask;foreignKey:rid_mastertask;opt_c"`
db.DBAdhocBuffer `json:",omitempty"`
db.DBGetIDInterface `json:",omitempty" gorm:"-"`
types.SQLTypable `json:",omitempty" gorm:"-"`
}
// TableName - Returns the table name for the object.
func (m ModelCoreMastertask) TableName() string {
return "core.mastertask"
}
// TableName - Returns the table name for the object.
func (m ModelCoreMastertask) TableNameOnly() string {
return "mastertask"
}
// SchemaName - Returns the schema name for the object.
func (m ModelCoreMastertask) SchemaName() string {
return "core"
}
// GetID - ID interface
func (m ModelCoreMastertask) GetID() int64 {
return int64(m.Ridmastertask)
}
// GetIDStr - ID interface
func (m ModelCoreMastertask) GetIDStr() string {
return fmt.Sprintf("%d", m.Ridmastertask)
}
// SetID - ID interface
func (m ModelCoreMastertask) SetID(newid int64) {
m.UpdateID(newid)
}
func (m *ModelCoreMastertask) UpdateID(newid int64) {
m.Ridmastertask = int32(newid)
}
// GetIDName - ID interface
func (m ModelCoreMastertask) GetIDName() string {
return "rid_mastertask"
}
// GetPrefix - Returns a table prefix
func (m ModelCoreMastertask) GetPrefix() string {
return "MTL"
}
// GetRowNumber - Returns the row number of the record
func (m ModelCoreMastertask) GetRowNumber() int64 {
return m.RowNumber
}
// SetRowNumber - Set the row number of a record
func (m *ModelCoreMastertask) SetRowNumber(num int64) {
m.RowNumber = num
}


@@ -0,0 +1,98 @@
package models
import "fmt"
import db "github.com/bitechdev/GoCore/pkg/models"
import "github.com/bitechdev/GoCore/pkg/types"
//ModelCoreMastertype - Generated Table for Schema core
type ModelCoreMastertype struct {
Category string `json:"category" gorm:"Column:category;type:citext;"`
Description string `json:"description" gorm:"Column:description;type:citext;"`
Disableedit types.SInt16 `json:"disableedit" gorm:"Column:disableedit;type:smallint;"`
Forprefix string `json:"forprefix" gorm:"Column:forprefix;type:citext;"`
GUID types.NullableUUID `json:"guid" gorm:"Column:guid;type:uuid;default:newid();"`
Hidden types.SInt16 `json:"hidden" gorm:"Column:hidden;type:smallint;"`
Inactive types.SInt16 `json:"inactive" gorm:"Column:inactive;type:smallint;"`
Jsonvalue types.NullableJSONB `json:"jsonvalue" gorm:"Column:jsonvalue;type:jsonb;"`
Mastertype string `json:"mastertype" gorm:"Column:mastertype;type:citext;"`
Note string `json:"note" gorm:"Column:note;type:citext;"`
Ridmastertype int32 `json:"rid_mastertype" gorm:"Column:rid_mastertype;type:integer;primaryKey;default:nextval('core.identity_mastertype_rid_mastertype'::regclass);"`
Ridparent types.ZNullInt32 `json:"rid_parent" gorm:"Column:rid_parent;type:integer;"`
Updatecnt int64 `json:"updatecnt" gorm:"Column:updatecnt;type:integer;default:0;"`
MTT *ModelCoreMastertype `json:"MTT,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_parent;"`
CMAT []*ModelCoreCommitem_Attachment `json:"CMAT,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype;opt_c"`
DVT []*ModelCoreDocumentvault `json:"DVT,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype;opt_c"`
EAD []*ModelCoreEmailaddresslist `json:"EAD,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype;opt_c"`
JSON []*ModelCoreJsonschema `json:"JSON,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype;opt_c"`
MAL []*ModelCoreMastertaskitem `json:"MAL,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype_hubtype;opt_c"`
MPR_RID_MASTERTYPE_HUBTYPE []*ModelCoreMasterprocess `json:"MPR_RID_MASTERTYPE_HUBTYPE,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype_hubtype;opt_c"`
MPR_RID_MASTERTYPE_PROCESSTYPE []*ModelCoreMasterprocess `json:"MPR_RID_MASTERTYPE_PROCESSTYPE,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype_processtype;opt_c"`
MSE []*ModelCoreMasterservice `json:"MSE,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype_hubtype;opt_c"`
MTL []*ModelCoreMastertask `json:"MTL,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype_tasktype;opt_c"`
MTT_RID_PARENT []*ModelCoreMastertype `json:"MTT_RID_PARENT,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_parent;opt_c"`
RUL []*ModelCoreMasterworkflowrule `json:"RUL,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype_group;opt_c"`
TAT_RID_MASTERTYPE_DOCGENTYPE []*ModelCoreMasterdoctemplate `json:"TAT_RID_MASTERTYPE_DOCGENTYPE,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype_docgentype;opt_c"`
TAT_RID_MASTERTYPE_DOCUMENT []*ModelCoreMasterdoctemplate `json:"TAT_RID_MASTERTYPE_DOCUMENT,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype_document;opt_c"`
TAT_RID_MASTERTYPE_GROUP []*ModelCoreMasterdoctemplate `json:"TAT_RID_MASTERTYPE_GROUP,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype_group;opt_c"`
TAT_RID_MASTERTYPE_HUBTYPE []*ModelCoreMasterdoctemplate `json:"TAT_RID_MASTERTYPE_HUBTYPE,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype_hubtype;opt_c"`
TAT_RID_MASTERTYPE_MERGETYPE []*ModelCoreMasterdoctemplate `json:"TAT_RID_MASTERTYPE_MERGETYPE,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype_mergetype;opt_c"`
TAT_RID_MASTERTYPE_TARGETTYPE []*ModelCoreMasterdoctemplate `json:"TAT_RID_MASTERTYPE_TARGETTYPE,omitempty" gorm:"references:rid_mastertype;foreignKey:rid_mastertype_targettype;opt_c"`
db.DBAdhocBuffer `json:",omitempty"`
db.DBGetIDInterface `json:",omitempty" gorm:"-"`
types.SQLTypable `json:",omitempty" gorm:"-"`
}
// TableName - Returns the table name for the object.
func (m ModelCoreMastertype) TableName() string {
return "core.mastertype"
}
// TableNameOnly - Returns the table name without the schema prefix.
func (m ModelCoreMastertype) TableNameOnly() string {
return "mastertype"
}
// SchemaName - Returns the schema name for the object.
func (m ModelCoreMastertype) SchemaName() string {
return "core"
}
// GetID - ID interface
func (m ModelCoreMastertype) GetID() int64 {
return int64(m.Ridmastertype)
}
// GetIDStr - ID interface
func (m ModelCoreMastertype) GetIDStr() string {
return fmt.Sprintf("%d", m.Ridmastertype)
}
// SetID - ID interface
func (m *ModelCoreMastertype) SetID(newid int64) {
m.UpdateID(newid)
}
// UpdateID - Sets the primary key from an int64 value.
func (m *ModelCoreMastertype) UpdateID(newid int64) {
m.Ridmastertype = int32(newid)
}
// GetIDName - ID interface
func (m ModelCoreMastertype) GetIDName() string {
return "rid_mastertype"
}
// GetPrefix - Returns a table prefix
func (m ModelCoreMastertype) GetPrefix() string {
return "MTT"
}
// GetRowNumber - Returns the row number of the record
func (m ModelCoreMastertype) GetRowNumber() int64 {
return m.RowNumber
}
// SetRowNumber - Set the row number of a record
func (m *ModelCoreMastertype) SetRowNumber(num int64) {
m.RowNumber = num
}


@@ -0,0 +1,80 @@
package models
import "fmt"
import db "github.com/bitechdev/GoCore/pkg/models"
import "github.com/bitechdev/GoCore/pkg/types"
// ModelCoreProcess - Generated Table for Schema core
type ModelCoreProcess struct {
Completedate types.CustomDate `json:"completedate" gorm:"Column:completedate;type:date;"`
Completetime types.CustomIntTime `json:"completetime" gorm:"Column:completetime;type:integer;"`
Description string `json:"description" gorm:"Column:description;type:citext;"`
GUID types.NullableUUID `json:"guid" gorm:"Column:guid;type:uuid;default:newid();"`
Ridcompleteuser types.ZNullInt32 `json:"rid_completeuser" gorm:"Column:rid_completeuser;type:integer;"`
Ridhub types.ZNullInt32 `json:"rid_hub" gorm:"Column:rid_hub;type:integer;"`
Ridmasterprocess types.ZNullInt32 `json:"rid_masterprocess" gorm:"Column:rid_masterprocess;type:integer;"`
Ridprocess int32 `json:"rid_process" gorm:"Column:rid_process;type:integer;primaryKey;default:nextval('core.identity_process_rid_process'::regclass);"`
Status string `json:"status" gorm:"Column:status;type:citext;"`
Updatecnt int64 `json:"updatecnt" gorm:"Column:updatecnt;type:integer;default:0;"`
HUB *ModelCoreHub `json:"HUB,omitempty" gorm:"references:rid_hub;foreignKey:rid_hub;"`
MPR *ModelCoreMasterprocess `json:"MPR,omitempty" gorm:"references:rid_masterprocess;foreignKey:rid_masterprocess;"`
TAS []*ModelCoreTasklist `json:"TAS,omitempty" gorm:"references:rid_process;foreignKey:rid_process;opt_c"`
db.DBAdhocBuffer `json:",omitempty"`
db.DBGetIDInterface `json:",omitempty" gorm:"-"`
types.SQLTypable `json:",omitempty" gorm:"-"`
}
// TableName - Returns the table name for the object.
func (m ModelCoreProcess) TableName() string {
return "core.process"
}
// TableNameOnly - Returns the table name without the schema prefix.
func (m ModelCoreProcess) TableNameOnly() string {
return "process"
}
// SchemaName - Returns the schema name for the object.
func (m ModelCoreProcess) SchemaName() string {
return "core"
}
// GetID - ID interface
func (m ModelCoreProcess) GetID() int64 {
return int64(m.Ridprocess)
}
// GetIDStr - ID interface
func (m ModelCoreProcess) GetIDStr() string {
return fmt.Sprintf("%d", m.Ridprocess)
}
// SetID - ID interface
func (m *ModelCoreProcess) SetID(newid int64) {
m.UpdateID(newid)
}
// UpdateID - Sets the primary key from an int64 value.
func (m *ModelCoreProcess) UpdateID(newid int64) {
m.Ridprocess = int32(newid)
}
// GetIDName - ID interface
func (m ModelCoreProcess) GetIDName() string {
return "rid_process"
}
// GetPrefix - Returns a table prefix
func (m ModelCoreProcess) GetPrefix() string {
return "PRO"
}
// GetRowNumber - Returns the row number of the record
func (m ModelCoreProcess) GetRowNumber() int64 {
return m.RowNumber
}
// SetRowNumber - Set the row number of a record
func (m *ModelCoreProcess) SetRowNumber(num int64) {
m.RowNumber = num
}

go.mod

@@ -3,6 +3,7 @@ module git.warky.dev/wdevs/relspecgo
go 1.25.5
require (
github.com/bitechdev/ResolveSpec v0.0.108 // indirect
github.com/fsnotify/fsnotify v1.9.0 // indirect
github.com/go-viper/mapstructure/v2 v2.4.0 // indirect
github.com/inconshreveable/mousetrap v1.1.0 // indirect
@@ -16,7 +17,7 @@ require (
github.com/spf13/viper v1.21.0 // indirect
github.com/subosito/gotenv v1.6.0 // indirect
go.yaml.in/yaml/v3 v3.0.4 // indirect
golang.org/x/sys v0.29.0 // indirect
golang.org/x/sys v0.35.0 // indirect
golang.org/x/text v0.28.0 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect
)

go.sum

@@ -1,3 +1,5 @@
github.com/bitechdev/ResolveSpec v0.0.108 h1:0Asw4zt9SdBIDprNqtrGY67R4SovAPBmW2y1qRn/Wjw=
github.com/bitechdev/ResolveSpec v0.0.108/go.mod h1:/mtVcbXSBLNmWlTKeDnbQx18tmNqOnrpetpLOadLzqo=
github.com/cpuguy83/go-md2man/v2 v2.0.6/go.mod h1:oOW0eioCTA6cOiMLiUPZOpcVxMig6NIQQ7OS05n1F4g=
github.com/fsnotify/fsnotify v1.9.0 h1:2Ml+OJNzbYCTzsxtv8vKSFD9PbJjmhYF14k/jKC7S9k=
github.com/fsnotify/fsnotify v1.9.0/go.mod h1:8jBTzvmWwFyi3Pb8djgCCO5IBqzKJ/Jwo8TRcHyHii0=
@@ -29,6 +31,7 @@ go.yaml.in/yaml/v3 v3.0.4 h1:tfq32ie2Jv2UxXFdLJdh3jXuOzWiL1fo0bu/FbuKpbc=
go.yaml.in/yaml/v3 v3.0.4/go.mod h1:DhzuOOF2ATzADvBadXxruRBLzYTpT36CKvDb3+aBEFg=
golang.org/x/sys v0.29.0 h1:TPYlXGxvx1MGTn2GiZDhnjPA9wZzZeGKHHmKhHYvgaU=
golang.org/x/sys v0.29.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.35.0/go.mod h1:BJP2sWEmIv4KK5OTEluFJCKSidICx8ciO85XgH3Ak8k=
golang.org/x/text v0.28.0 h1:rhazDwis8INMIwQ4tpjLDzUhx6RlXqZNPEM0huQojng=
golang.org/x/text v0.28.0/go.mod h1:U8nCwOR8jO/marOQ0QbDiOngZVEBB7MAiitBuMjXiNU=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=

pkg/models/flatview.go

@@ -0,0 +1,199 @@
package models
import "fmt"
// =============================================================================
// Flat/Denormalized Views - Flattened structures with fully qualified names
// =============================================================================
// FlatColumn represents a column with full context in a single structure
type FlatColumn struct {
DatabaseName string `json:"database_name" yaml:"database_name" xml:"database_name"`
SchemaName string `json:"schema_name" yaml:"schema_name" xml:"schema_name"`
TableName string `json:"table_name" yaml:"table_name" xml:"table_name"`
ColumnName string `json:"column_name" yaml:"column_name" xml:"column_name"`
FullyQualifiedName string `json:"fully_qualified_name" yaml:"fully_qualified_name" xml:"fully_qualified_name"` // database.schema.table.column
Type string `json:"type" yaml:"type" xml:"type"`
Length int `json:"length,omitempty" yaml:"length,omitempty" xml:"length,omitempty"`
Precision int `json:"precision,omitempty" yaml:"precision,omitempty" xml:"precision,omitempty"`
Scale int `json:"scale,omitempty" yaml:"scale,omitempty" xml:"scale,omitempty"`
NotNull bool `json:"not_null" yaml:"not_null" xml:"not_null"`
Default any `json:"default,omitempty" yaml:"default,omitempty" xml:"default,omitempty"`
AutoIncrement bool `json:"auto_increment" yaml:"auto_increment" xml:"auto_increment"`
IsPrimaryKey bool `json:"is_primary_key" yaml:"is_primary_key" xml:"is_primary_key"`
Description string `json:"description,omitempty" yaml:"description,omitempty" xml:"description,omitempty"`
Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
}
// ToFlatColumns converts a Database to a slice of FlatColumns
func (d *Database) ToFlatColumns() []*FlatColumn {
flatColumns := make([]*FlatColumn, 0)
for _, schema := range d.Schemas {
for _, table := range schema.Tables {
for _, column := range table.Columns {
flatColumns = append(flatColumns, &FlatColumn{
DatabaseName: d.Name,
SchemaName: schema.Name,
TableName: table.Name,
ColumnName: column.Name,
FullyQualifiedName: fmt.Sprintf("%s.%s.%s.%s", d.Name, schema.Name, table.Name, column.Name),
Type: column.Type,
Length: column.Length,
Precision: column.Precision,
Scale: column.Scale,
NotNull: column.NotNull,
Default: column.Default,
AutoIncrement: column.AutoIncrement,
IsPrimaryKey: column.IsPrimaryKey,
Description: column.Description,
Comment: column.Comment,
})
}
}
}
return flatColumns
}
// FlatTable represents a table with full context
type FlatTable struct {
DatabaseName string `json:"database_name" yaml:"database_name" xml:"database_name"`
SchemaName string `json:"schema_name" yaml:"schema_name" xml:"schema_name"`
TableName string `json:"table_name" yaml:"table_name" xml:"table_name"`
FullyQualifiedName string `json:"fully_qualified_name" yaml:"fully_qualified_name" xml:"fully_qualified_name"` // database.schema.table
Description string `json:"description,omitempty" yaml:"description,omitempty" xml:"description,omitempty"`
Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
Tablespace string `json:"tablespace,omitempty" yaml:"tablespace,omitempty" xml:"tablespace,omitempty"`
ColumnCount int `json:"column_count" yaml:"column_count" xml:"column_count"`
ConstraintCount int `json:"constraint_count" yaml:"constraint_count" xml:"constraint_count"`
IndexCount int `json:"index_count" yaml:"index_count" xml:"index_count"`
}
// ToFlatTables converts a Database to a slice of FlatTables
func (d *Database) ToFlatTables() []*FlatTable {
flatTables := make([]*FlatTable, 0)
for _, schema := range d.Schemas {
for _, table := range schema.Tables {
flatTables = append(flatTables, &FlatTable{
DatabaseName: d.Name,
SchemaName: schema.Name,
TableName: table.Name,
FullyQualifiedName: fmt.Sprintf("%s.%s.%s", d.Name, schema.Name, table.Name),
Description: table.Description,
Comment: table.Comment,
Tablespace: table.Tablespace,
ColumnCount: len(table.Columns),
ConstraintCount: len(table.Constraints),
IndexCount: len(table.Indexes),
})
}
}
return flatTables
}
// FlatConstraint represents a constraint with full context
type FlatConstraint struct {
DatabaseName string `json:"database_name" yaml:"database_name" xml:"database_name"`
SchemaName string `json:"schema_name" yaml:"schema_name" xml:"schema_name"`
TableName string `json:"table_name" yaml:"table_name" xml:"table_name"`
ConstraintName string `json:"constraint_name" yaml:"constraint_name" xml:"constraint_name"`
FullyQualifiedName string `json:"fully_qualified_name" yaml:"fully_qualified_name" xml:"fully_qualified_name"` // database.schema.table.constraint
Type ConstraintType `json:"type" yaml:"type" xml:"type"`
Columns []string `json:"columns" yaml:"columns" xml:"columns"`
Expression string `json:"expression,omitempty" yaml:"expression,omitempty" xml:"expression,omitempty"`
ReferencedTable string `json:"referenced_table,omitempty" yaml:"referenced_table,omitempty" xml:"referenced_table,omitempty"`
ReferencedSchema string `json:"referenced_schema,omitempty" yaml:"referenced_schema,omitempty" xml:"referenced_schema,omitempty"`
ReferencedColumns []string `json:"referenced_columns,omitempty" yaml:"referenced_columns,omitempty" xml:"referenced_columns,omitempty"`
ReferencedFQN string `json:"referenced_fqn,omitempty" yaml:"referenced_fqn,omitempty" xml:"referenced_fqn,omitempty"` // Fully qualified reference
OnDelete string `json:"on_delete,omitempty" yaml:"on_delete,omitempty" xml:"on_delete,omitempty"`
OnUpdate string `json:"on_update,omitempty" yaml:"on_update,omitempty" xml:"on_update,omitempty"`
}
// ToFlatConstraints converts a Database to a slice of FlatConstraints
func (d *Database) ToFlatConstraints() []*FlatConstraint {
flatConstraints := make([]*FlatConstraint, 0)
for _, schema := range d.Schemas {
for _, table := range schema.Tables {
for _, constraint := range table.Constraints {
fc := &FlatConstraint{
DatabaseName: d.Name,
SchemaName: schema.Name,
TableName: table.Name,
ConstraintName: constraint.Name,
FullyQualifiedName: fmt.Sprintf("%s.%s.%s.%s", d.Name, schema.Name, table.Name, constraint.Name),
Type: constraint.Type,
Columns: constraint.Columns,
Expression: constraint.Expression,
ReferencedTable: constraint.ReferencedTable,
ReferencedSchema: constraint.ReferencedSchema,
ReferencedColumns: constraint.ReferencedColumns,
OnDelete: constraint.OnDelete,
OnUpdate: constraint.OnUpdate,
}
// Build fully qualified reference name for foreign keys
if constraint.Type == ForeignKeyConstraint && constraint.ReferencedTable != "" {
fc.ReferencedFQN = fmt.Sprintf("%s.%s.%s", d.Name, constraint.ReferencedSchema, constraint.ReferencedTable)
}
flatConstraints = append(flatConstraints, fc)
}
}
}
return flatConstraints
}
// FlatRelationship represents a relationship with full context
type FlatRelationship struct {
DatabaseName string `json:"database_name" yaml:"database_name" xml:"database_name"`
RelationshipName string `json:"relationship_name" yaml:"relationship_name" xml:"relationship_name"`
Type RelationType `json:"type" yaml:"type" xml:"type"`
FromFQN string `json:"from_fqn" yaml:"from_fqn" xml:"from_fqn"` // database.schema.table
ToFQN string `json:"to_fqn" yaml:"to_fqn" xml:"to_fqn"` // database.schema.table
FromTable string `json:"from_table" yaml:"from_table" xml:"from_table"`
FromSchema string `json:"from_schema" yaml:"from_schema" xml:"from_schema"`
ToTable string `json:"to_table" yaml:"to_table" xml:"to_table"`
ToSchema string `json:"to_schema" yaml:"to_schema" xml:"to_schema"`
ForeignKey string `json:"foreign_key" yaml:"foreign_key" xml:"foreign_key"`
ThroughTableFQN string `json:"through_table_fqn,omitempty" yaml:"through_table_fqn,omitempty" xml:"through_table_fqn,omitempty"`
Description string `json:"description,omitempty" yaml:"description,omitempty" xml:"description,omitempty"`
}
// ToFlatRelationships converts a Database to a slice of FlatRelationships
func (d *Database) ToFlatRelationships() []*FlatRelationship {
flatRelationships := make([]*FlatRelationship, 0)
for _, schema := range d.Schemas {
for _, table := range schema.Tables {
for _, relationship := range table.Relationships {
fr := &FlatRelationship{
DatabaseName: d.Name,
RelationshipName: relationship.Name,
Type: relationship.Type,
FromFQN: fmt.Sprintf("%s.%s.%s", d.Name, relationship.FromSchema, relationship.FromTable),
ToFQN: fmt.Sprintf("%s.%s.%s", d.Name, relationship.ToSchema, relationship.ToTable),
FromTable: relationship.FromTable,
FromSchema: relationship.FromSchema,
ToTable: relationship.ToTable,
ToSchema: relationship.ToSchema,
ForeignKey: relationship.ForeignKey,
Description: relationship.Description,
}
// Add through table FQN for many-to-many relationships
if relationship.ThroughTable != "" {
fr.ThroughTableFQN = fmt.Sprintf("%s.%s.%s", d.Name, relationship.ThroughSchema, relationship.ThroughTable)
}
flatRelationships = append(flatRelationships, fr)
}
}
}
return flatRelationships
}
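The flattening walk used by `ToFlatColumns` and friends (schemas → tables → columns, emitting one `database.schema.table.column` row per leaf) can be sketched standalone. The struct names below are minimal stand-ins for the relspec models, not the full types:

```go
package main

import "fmt"

// Minimal stand-ins for the relspec models; the real types carry many more fields.
type Column struct{ Name string }
type Table struct {
	Name    string
	Columns map[string]*Column // columns keyed by name, as in models.Table
}
type Schema struct {
	Name   string
	Tables []*Table // ordered slice, as in models.Schema
}
type Database struct {
	Name    string
	Schemas []*Schema
}

// flatten mirrors ToFlatColumns: walk schemas -> tables -> columns and
// emit one fully qualified name per column.
func flatten(d *Database) []string {
	out := make([]string, 0)
	for _, s := range d.Schemas {
		for _, t := range s.Tables {
			for _, c := range t.Columns {
				out = append(out, fmt.Sprintf("%s.%s.%s.%s", d.Name, s.Name, t.Name, c.Name))
			}
		}
	}
	return out
}

func main() {
	db := &Database{
		Name: "app",
		Schemas: []*Schema{{
			Name: "core",
			Tables: []*Table{{
				Name:    "process",
				Columns: map[string]*Column{"rid_process": {Name: "rid_process"}},
			}},
		}},
	}
	for _, fqn := range flatten(db) {
		fmt.Println(fqn)
	}
}
```

Note that because columns live in a map, the emitted order is not deterministic across runs; the real flat views carry a `Sequence` field when ordering matters.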

pkg/models/interface.go

@@ -0,0 +1,6 @@
package models
// SQLNamer is the interface for types that can provide a normalized SQL name
type SQLNamer interface {
SQLName() string
}
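All model types implement this interface, so generators can build qualified identifiers without caring which concrete type they hold. A minimal sketch with two stand-in types (the real implementations live on Database, Schema, Table, Column, Index, Relationship, Constraint, and Script):

```go
package main

import (
	"fmt"
	"strings"
)

// SQLNamer mirrors the interface in pkg/models/interface.go.
type SQLNamer interface {
	SQLName() string
}

// Two minimal stand-in types for illustration.
type Table struct{ Name string }
type Column struct{ Name string }

func (t *Table) SQLName() string  { return strings.ToLower(t.Name) }
func (c *Column) SQLName() string { return strings.ToLower(c.Name) }

// qualifiedName joins the lowercase SQL names of any SQLNamer values,
// e.g. table.column or schema.table.
func qualifiedName(parts ...SQLNamer) string {
	names := make([]string, 0, len(parts))
	for _, p := range parts {
		names = append(names, p.SQLName())
	}
	return strings.Join(names, ".")
}

func main() {
	fmt.Println(qualifiedName(&Table{Name: "MasterTask"}, &Column{Name: "Rid_MasterTask"}))
	// mastertask.rid_mastertask
}
```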


@@ -1,5 +1,7 @@
package models
import "strings"
type DatabaseType string
const (
@@ -11,24 +13,39 @@ const (
// Database represents the complete database schema
type Database struct {
Name string `json:"name" yaml:"name"`
Description string `json:"description,omitempty" yaml:"description,omitempty" xml:"description,omitempty"`
Schemas []*Schema `json:"schemas" yaml:"schemas" xml:"schemas"`
Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
DatabaseType DatabaseType `json:"database_type,omitempty" yaml:"database_type,omitempty" xml:"database_type,omitempty"`
DatabaseVersion string `json:"database_version,omitempty" yaml:"database_version,omitempty" xml:"database_version,omitempty"`
SourceFormat string `json:"source_format,omitempty" yaml:"source_format,omitempty" xml:"source_format,omitempty"` // Source format of the database.
}
// SQLName returns the database name in lowercase
func (d *Database) SQLName() string {
return strings.ToLower(d.Name)
}
type Schema struct {
Name string `json:"name" yaml:"name" xml:"name"`
Tables []*Table `json:"tables" yaml:"tables" xml:"-"`
Owner string `json:"owner" yaml:"owner" xml:"owner"`
Permissions map[string]string `json:"permissions,omitempty" yaml:"permissions,omitempty" xml:"-"`
Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
Metadata map[string]interface{} `json:"metadata,omitempty" yaml:"metadata,omitempty" xml:"-"`
Scripts []*Script `json:"scripts,omitempty" yaml:"scripts,omitempty" xml:"scripts,omitempty"`
Name string `json:"name" yaml:"name" xml:"name"`
Description string `json:"description,omitempty" yaml:"description,omitempty" xml:"description,omitempty"`
Tables []*Table `json:"tables" yaml:"tables" xml:"-"`
Owner string `json:"owner" yaml:"owner" xml:"owner"`
Permissions map[string]string `json:"permissions,omitempty" yaml:"permissions,omitempty" xml:"-"`
Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
Metadata map[string]any `json:"metadata,omitempty" yaml:"metadata,omitempty" xml:"-"`
Scripts []*Script `json:"scripts,omitempty" yaml:"scripts,omitempty" xml:"scripts,omitempty"`
Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
}
// SQLName returns the schema name in lowercase
func (d *Schema) SQLName() string {
return strings.ToLower(d.Name)
}
type Table struct {
Name string `json:"name" yaml:"name" xml:"name"`
Description string `json:"description,omitempty" yaml:"description,omitempty" xml:"description,omitempty"`
Schema string `json:"schema" yaml:"schema" xml:"schema"`
Columns map[string]*Column `json:"columns" yaml:"columns" xml:"-"`
Constraints map[string]*Constraint `json:"constraints" yaml:"constraints" xml:"-"`
@@ -36,7 +53,13 @@ type Table struct {
Relationships map[string]*Relationship `json:"relationships,omitempty" yaml:"relationships,omitempty" xml:"-"`
Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
Tablespace string `json:"tablespace,omitempty" yaml:"tablespace,omitempty" xml:"tablespace,omitempty"`
Metadata map[string]interface{} `json:"metadata,omitempty" yaml:"metadata,omitempty" xml:"-"`
Metadata map[string]any `json:"metadata,omitempty" yaml:"metadata,omitempty" xml:"-"`
Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
}
// SQLName returns the table name in lowercase
func (d *Table) SQLName() string {
return strings.ToLower(d.Name)
}
func (m Table) GetPrimaryKey() *Column {
@@ -61,32 +84,46 @@ func (m Table) GetForeignKeys() []*Constraint {
// Column represents a table column
type Column struct {
Name string `json:"name" yaml:"name" xml:"name"`
Table string `json:"table" yaml:"table" xml:"table"`
Schema string `json:"schema" yaml:"schema" xml:"schema"`
Type string `json:"type" yaml:"type" xml:"type"`
Length int `json:"length,omitempty" yaml:"length,omitempty" xml:"length,omitempty"`
Precision int `json:"precision,omitempty" yaml:"precision,omitempty" xml:"precision,omitempty"`
Scale int `json:"scale,omitempty" yaml:"scale,omitempty" xml:"scale,omitempty"`
NotNull bool `json:"not_null" yaml:"not_null" xml:"not_null"`
Default interface{} `json:"default,omitempty" yaml:"default,omitempty" xml:"default,omitempty"`
AutoIncrement bool `json:"auto_increment" yaml:"auto_increment" xml:"auto_increment"`
IsPrimaryKey bool `json:"is_primary_key" yaml:"is_primary_key" xml:"is_primary_key"`
Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
Collation string `json:"collation,omitempty" yaml:"collation,omitempty" xml:"collation,omitempty"`
Name string `json:"name" yaml:"name" xml:"name"`
Description string `json:"description,omitempty" yaml:"description,omitempty" xml:"description,omitempty"`
Table string `json:"table" yaml:"table" xml:"table"`
Schema string `json:"schema" yaml:"schema" xml:"schema"`
Type string `json:"type" yaml:"type" xml:"type"`
Length int `json:"length,omitempty" yaml:"length,omitempty" xml:"length,omitempty"`
Precision int `json:"precision,omitempty" yaml:"precision,omitempty" xml:"precision,omitempty"`
Scale int `json:"scale,omitempty" yaml:"scale,omitempty" xml:"scale,omitempty"`
NotNull bool `json:"not_null" yaml:"not_null" xml:"not_null"`
Default any `json:"default,omitempty" yaml:"default,omitempty" xml:"default,omitempty"`
AutoIncrement bool `json:"auto_increment" yaml:"auto_increment" xml:"auto_increment"`
IsPrimaryKey bool `json:"is_primary_key" yaml:"is_primary_key" xml:"is_primary_key"`
Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
Collation string `json:"collation,omitempty" yaml:"collation,omitempty" xml:"collation,omitempty"`
Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
}
// SQLName returns the column name in lowercase
func (d *Column) SQLName() string {
return strings.ToLower(d.Name)
}
type Index struct {
Name string `json:"name" yaml:"name" xml:"name"`
Table string `json:"table,omitempty" yaml:"table,omitempty" xml:"table,omitempty"`
Schema string `json:"schema,omitempty" yaml:"schema,omitempty" xml:"schema,omitempty"`
Columns []string `json:"columns" yaml:"columns" xml:"columns"`
Unique bool `json:"unique" yaml:"unique" xml:"unique"`
Type string `json:"type" yaml:"type" xml:"type"` // btree, hash, gin, gist, etc.
Where string `json:"where,omitempty" yaml:"where,omitempty" xml:"where,omitempty"` // partial index condition
Concurrent bool `json:"concurrent,omitempty" yaml:"concurrent,omitempty" xml:"concurrent,omitempty"`
Include []string `json:"include,omitempty" yaml:"include,omitempty" xml:"include,omitempty"` // INCLUDE columns
Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
Name string `json:"name" yaml:"name" xml:"name"`
Description string `json:"description,omitempty" yaml:"description,omitempty" xml:"description,omitempty"`
Table string `json:"table,omitempty" yaml:"table,omitempty" xml:"table,omitempty"`
Schema string `json:"schema,omitempty" yaml:"schema,omitempty" xml:"schema,omitempty"`
Columns []string `json:"columns" yaml:"columns" xml:"columns"`
Unique bool `json:"unique" yaml:"unique" xml:"unique"`
Type string `json:"type" yaml:"type" xml:"type"` // btree, hash, gin, gist, etc.
Where string `json:"where,omitempty" yaml:"where,omitempty" xml:"where,omitempty"` // partial index condition
Concurrent bool `json:"concurrent,omitempty" yaml:"concurrent,omitempty" xml:"concurrent,omitempty"`
Include []string `json:"include,omitempty" yaml:"include,omitempty" xml:"include,omitempty"` // INCLUDE columns
Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
}
// SQLName returns the index name in lowercase
func (d *Index) SQLName() string {
return strings.ToLower(d.Name)
}
type RelationType string
@@ -109,6 +146,12 @@ type Relationship struct {
ThroughTable string `json:"through_table,omitempty" yaml:"through_table,omitempty" xml:"through_table,omitempty"` // For many-to-many
ThroughSchema string `json:"through_schema,omitempty" yaml:"through_schema,omitempty" xml:"through_schema,omitempty"`
Description string `json:"description,omitempty" yaml:"description,omitempty" xml:"description,omitempty"`
Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
}
// SQLName returns the relationship name in lowercase
func (d *Relationship) SQLName() string {
return strings.ToLower(d.Name)
}
type Constraint struct {
@@ -125,6 +168,11 @@ type Constraint struct {
OnUpdate string `json:"on_update" yaml:"on_update" xml:"on_update"`
Deferrable bool `json:"deferrable,omitempty" yaml:"deferrable,omitempty" xml:"deferrable,omitempty"`
InitiallyDeferred bool `json:"initially_deferred,omitempty" yaml:"initially_deferred,omitempty" xml:"initially_deferred,omitempty"`
Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
}
// SQLName returns the constraint name in lowercase
func (d *Constraint) SQLName() string {
return strings.ToLower(d.Name)
}
type ConstraintType string
@@ -146,4 +194,88 @@ type Script struct {
Schema string `json:"schema,omitempty" yaml:"schema,omitempty" xml:"schema,omitempty"`
Version string `json:"version,omitempty" yaml:"version,omitempty" xml:"version,omitempty"`
Priority int `json:"priority,omitempty" yaml:"priority,omitempty" xml:"priority,omitempty"`
Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
}
// SQLName returns the script name in lowercase
func (d *Script) SQLName() string {
return strings.ToLower(d.Name)
}
// Initialize functions
// InitDatabase initializes a new Database with empty slices
func InitDatabase(name string) *Database {
return &Database{
Name: name,
Schemas: make([]*Schema, 0),
}
}
// InitSchema initializes a new Schema with empty slices and maps
func InitSchema(name string) *Schema {
return &Schema{
Name: name,
Tables: make([]*Table, 0),
Permissions: make(map[string]string),
Metadata: make(map[string]any),
Scripts: make([]*Script, 0),
}
}
// InitTable initializes a new Table with empty maps
func InitTable(name, schema string) *Table {
return &Table{
Name: name,
Schema: schema,
Columns: make(map[string]*Column),
Constraints: make(map[string]*Constraint),
Indexes: make(map[string]*Index),
Relationships: make(map[string]*Relationship),
Metadata: make(map[string]any),
}
}
// InitColumn initializes a new Column
func InitColumn(name, table, schema string) *Column {
return &Column{
Name: name,
Table: table,
Schema: schema,
}
}
// InitIndex initializes a new Index with empty slices
func InitIndex(name string) *Index {
return &Index{
Name: name,
Columns: make([]string, 0),
Include: make([]string, 0),
}
}
// InitRelationship initializes a new Relationship with empty maps
func InitRelationship(name string, relType RelationType) *Relationship {
return &Relationship{
Name: name,
Type: relType,
Properties: make(map[string]string),
}
}
// InitConstraint initializes a new Constraint with empty slices
func InitConstraint(name string, constraintType ConstraintType) *Constraint {
return &Constraint{
Name: name,
Type: constraintType,
Columns: make([]string, 0),
ReferencedColumns: make([]string, 0),
}
}
// InitScript initializes a new Script with empty slices
func InitScript(name string) *Script {
return &Script{
Name: name,
RunAfter: make([]string, 0),
}
}

pkg/models/summaryview.go

@@ -0,0 +1,99 @@
package models
// =============================================================================
// Summary/Compact Views - Lightweight views with essential fields
// =============================================================================
// DatabaseSummary provides a compact overview of a database
type DatabaseSummary struct {
Name string `json:"name" yaml:"name" xml:"name"`
Description string `json:"description,omitempty" yaml:"description,omitempty" xml:"description,omitempty"`
DatabaseType DatabaseType `json:"database_type,omitempty" yaml:"database_type,omitempty" xml:"database_type,omitempty"`
DatabaseVersion string `json:"database_version,omitempty" yaml:"database_version,omitempty" xml:"database_version,omitempty"`
SchemaCount int `json:"schema_count" yaml:"schema_count" xml:"schema_count"`
TotalTables int `json:"total_tables" yaml:"total_tables" xml:"total_tables"`
TotalColumns int `json:"total_columns" yaml:"total_columns" xml:"total_columns"`
}
// ToSummary converts a Database to a DatabaseSummary
func (d *Database) ToSummary() *DatabaseSummary {
summary := &DatabaseSummary{
Name: d.Name,
Description: d.Description,
DatabaseType: d.DatabaseType,
DatabaseVersion: d.DatabaseVersion,
SchemaCount: len(d.Schemas),
}
// Count total tables and columns
for _, schema := range d.Schemas {
summary.TotalTables += len(schema.Tables)
for _, table := range schema.Tables {
summary.TotalColumns += len(table.Columns)
}
}
return summary
}
// SchemaSummary provides a compact overview of a schema
type SchemaSummary struct {
Name string `json:"name" yaml:"name" xml:"name"`
Description string `json:"description,omitempty" yaml:"description,omitempty" xml:"description,omitempty"`
Owner string `json:"owner" yaml:"owner" xml:"owner"`
TableCount int `json:"table_count" yaml:"table_count" xml:"table_count"`
ScriptCount int `json:"script_count" yaml:"script_count" xml:"script_count"`
TotalColumns int `json:"total_columns" yaml:"total_columns" xml:"total_columns"`
TotalConstraints int `json:"total_constraints" yaml:"total_constraints" xml:"total_constraints"`
}
// ToSummary converts a Schema to a SchemaSummary
func (s *Schema) ToSummary() *SchemaSummary {
summary := &SchemaSummary{
Name: s.Name,
Description: s.Description,
Owner: s.Owner,
TableCount: len(s.Tables),
ScriptCount: len(s.Scripts),
}
// Count columns and constraints
for _, table := range s.Tables {
summary.TotalColumns += len(table.Columns)
summary.TotalConstraints += len(table.Constraints)
}
return summary
}
// TableSummary provides a compact overview of a table
type TableSummary struct {
Name string `json:"name" yaml:"name" xml:"name"`
Schema string `json:"schema" yaml:"schema" xml:"schema"`
Description string `json:"description,omitempty" yaml:"description,omitempty" xml:"description,omitempty"`
ColumnCount int `json:"column_count" yaml:"column_count" xml:"column_count"`
ConstraintCount int `json:"constraint_count" yaml:"constraint_count" xml:"constraint_count"`
IndexCount int `json:"index_count" yaml:"index_count" xml:"index_count"`
RelationshipCount int `json:"relationship_count" yaml:"relationship_count" xml:"relationship_count"`
HasPrimaryKey bool `json:"has_primary_key" yaml:"has_primary_key" xml:"has_primary_key"`
ForeignKeyCount int `json:"foreign_key_count" yaml:"foreign_key_count" xml:"foreign_key_count"`
}
// ToSummary converts a Table to a TableSummary
func (t *Table) ToSummary() *TableSummary {
summary := &TableSummary{
Name: t.Name,
Schema: t.Schema,
Description: t.Description,
ColumnCount: len(t.Columns),
ConstraintCount: len(t.Constraints),
IndexCount: len(t.Indexes),
RelationshipCount: len(t.Relationships),
HasPrimaryKey: t.GetPrimaryKey() != nil,
}
// Count foreign keys
summary.ForeignKeyCount = len(t.GetForeignKeys())
return summary
}
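As an illustration of the wire shape, a TableSummary for a hypothetical `users` table serializes to JSON roughly as below (the empty Description is dropped by `omitempty`; the counts are invented):

```json
{
  "name": "users",
  "schema": "public",
  "column_count": 8,
  "constraint_count": 2,
  "index_count": 3,
  "relationship_count": 1,
  "has_primary_key": true,
  "foreign_key_count": 1
}
```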

pkg/pgsql/datatypes.go (new file, 154 lines)
package pgsql
import "strings"
// GoToStdTypes maps Go type names to generic, database-agnostic SQL type names.
var GoToStdTypes = map[string]string{
"bool": "boolean",
"int64": "integer",
"int": "integer",
"int8": "integer",
"int16": "integer",
"int32": "integer",
"uint": "integer",
"uint8": "integer",
"uint16": "integer",
"uint32": "integer",
"uint64": "integer",
"uintptr": "integer",
"znullint64": "integer",
"znullint32": "integer",
"znullbyte": "integer",
"float64": "double",
"float32": "double",
"complex64": "double",
"complex128": "double",
"customfloat64": "double",
"string": "string",
"Pointer": "integer",
"[]byte": "blob",
"customdate": "string",
"customtime": "string",
"customtimestamp": "string",
"sqlfloat64": "double",
"sqlfloat16": "double",
"sqluuid": "uuid",
"sqljsonb": "jsonb",
"sqljson": "json",
"sqlint64": "bigint",
"sqlint32": "integer",
"sqlint16": "integer",
"sqlbool": "boolean",
"sqlstring": "string",
"nullablejsonb": "jsonb",
"nullablejson": "json",
"nullableuuid": "uuid",
"sqldate": "date",
"sqltime": "time",
"sqltimestamp": "timestamp",
}
// GoToPGSQLTypes maps Go type names to PostgreSQL-specific type names.
var GoToPGSQLTypes = map[string]string{
"bool": "boolean",
"int64": "bigint",
"int": "integer",
"int8": "smallint",
"int16": "smallint",
"int32": "integer",
"uint": "integer",
"uint8": "smallint",
"uint16": "smallint",
"uint32": "integer",
"uint64": "bigint",
"uintptr": "bigint",
"znullint64": "bigint",
"znullint32": "integer",
"znullbyte": "integer",
"float64": "double precision",
"float32": "real",
"complex64": "double precision",
"complex128": "double precision",
"customfloat64": "double precision",
"string": "text",
"Pointer": "bigint",
"[]byte": "bytea",
"customdate": "date",
"customtime": "time",
"customtimestamp": "timestamp",
"sqlfloat64": "double precision",
"sqlfloat16": "double precision",
"sqluuid": "uuid",
"sqljsonb": "jsonb",
"sqljson": "json",
"sqlint64": "bigint",
"sqlint32": "integer",
"sqlint16": "integer",
"sqlbool": "boolean",
"sqlstring": "text",
"nullablejsonb": "jsonb",
"nullablejson": "json",
"nullableuuid": "uuid",
"sqldate": "date",
"sqltime": "time",
"sqltimestamp": "timestamp",
"citext": "citext",
}
// ValidSQLType reports whether sqltype matches a known SQL type name (case-insensitive).
func ValidSQLType(sqltype string) bool {
for _, sql := range GoToPGSQLTypes {
if strings.EqualFold(sql, sqltype) {
return true
}
}
for _, sql := range GoToStdTypes {
if strings.EqualFold(sql, sqltype) {
return true
}
}
return false
}
// GetSQLType resolves a Go or SQL type name to a PostgreSQL type, falling back to "text" for unknown types.
func GetSQLType(anytype string) string {
for gotype, sql := range GoToPGSQLTypes {
if strings.EqualFold(gotype, anytype) || strings.EqualFold(sql, anytype) {
return sql
}
}
for gotype, sql := range GoToStdTypes {
if strings.EqualFold(gotype, anytype) || strings.EqualFold(sql, anytype) {
return sql
}
}
return "text"
}
// ConvertSQLType resolves a Go or SQL type name to a PostgreSQL type, returning the input unchanged when unknown.
func ConvertSQLType(anytype string) string {
for gotype, sql := range GoToPGSQLTypes {
if strings.EqualFold(gotype, anytype) || strings.EqualFold(sql, anytype) {
return sql
}
}
for gotype, sql := range GoToStdTypes {
if strings.EqualFold(gotype, anytype) || strings.EqualFold(sql, anytype) {
return sql
}
}
return anytype
}
// IsGoType reports whether pTypeName is a recognized Go type name.
func IsGoType(pTypeName string) bool {
for k := range GoToStdTypes {
if strings.EqualFold(pTypeName, k) {
return true
}
}
return false
}
// GetStdTypeFromGo maps a Go type name to its generic SQL type, or returns the input unchanged.
func GetStdTypeFromGo(pTypeName string) string {
for k, s := range GoToStdTypes {
if strings.EqualFold(pTypeName, k) {
return s
}
}
return pTypeName
}
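To make the lookup semantics concrete, here is a self-contained sketch (the map is a tiny stand-in, not the package's real tables) showing how GetSQLType and ConvertSQLType share one case-insensitive scan and differ only in their fallback for unknown types:

```go
package main

import (
	"fmt"
	"strings"
)

// Tiny stand-in for GoToPGSQLTypes; the real map lives in pkg/pgsql.
var goToPGSQL = map[string]string{
	"int64":  "bigint",
	"string": "text",
}

// lookup mirrors the shared scan: match either the Go name or the SQL
// name, case-insensitively.
func lookup(anytype string) (string, bool) {
	for gotype, sql := range goToPGSQL {
		if strings.EqualFold(gotype, anytype) || strings.EqualFold(sql, anytype) {
			return sql, true
		}
	}
	return "", false
}

// getSQLType falls back to "text" for unknown types.
func getSQLType(anytype string) string {
	if sql, ok := lookup(anytype); ok {
		return sql
	}
	return "text"
}

// convertSQLType passes unknown types through unchanged.
func convertSQLType(anytype string) string {
	if sql, ok := lookup(anytype); ok {
		return sql
	}
	return anytype
}

func main() {
	fmt.Println(getSQLType("INT64"))        // bigint
	fmt.Println(getSQLType("geometry"))     // text
	fmt.Println(convertSQLType("geometry")) // geometry
}
```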

pkg/pgsql/keywords.go (new file, 62 lines)
package pgsql
// postgresKeywords contains PostgreSQL reserved keywords that must be quoted when used as identifiers
var postgresKeywords = map[string]bool{
"abort": true, "action": true, "add": true, "after": true, "all": true, "alter": true, "and": true,
"any": true, "array": true, "as": true, "asc": true, "asymmetric": true, "at": true, "authorization": true,
"begin": true, "between": true, "bigint": true, "binary": true, "bit": true, "boolean": true, "both": true,
"by": true, "cascade": true, "case": true, "cast": true, "char": true, "character": true, "check": true,
"collate": true, "collation": true, "column": true, "commit": true, "concurrently": true, "constraint": true,
"create": true, "cross": true, "current": true, "current_catalog": true, "current_date": true,
"current_role": true, "current_schema": true, "current_time": true, "current_timestamp": true,
"current_user": true, "cursor": true, "cycle": true, "date": true, "day": true, "deallocate": true,
"dec": true, "decimal": true, "declare": true, "default": true, "deferrable": true, "deferred": true,
"delete": true, "desc": true, "distinct": true, "do": true, "drop": true, "each": true, "else": true,
"end": true, "enum": true, "escape": true, "except": true, "exclude": true, "execute": true, "exists": true,
"extract": true, "false": true, "fetch": true, "filter": true, "first": true, "float": true, "following": true,
"for": true, "foreign": true, "from": true, "full": true, "function": true, "global": true, "grant": true,
"group": true, "having": true, "hour": true, "identity": true, "if": true, "ilike": true, "in": true,
"include": true, "increment": true, "index": true, "inherit": true, "initially": true, "inner": true,
"inout": true, "input": true, "insert": true, "instead": true, "int": true, "integer": true, "intersect": true,
"interval": true, "into": true, "is": true, "isolation": true, "join": true, "key": true, "language": true,
"large": true, "last": true, "lateral": true, "leading": true, "left": true, "level": true, "like": true,
"limit": true, "listen": true, "load": true, "local": true, "localtime": true, "localtimestamp": true,
"location": true, "lock": true, "match": true, "minute": true, "mode": true, "month": true, "move": true,
"name": true, "national": true, "natural": true, "nchar": true, "new": true, "next": true, "no": true,
"none": true, "not": true, "nothing": true, "notify": true, "null": true, "nulls": true, "numeric": true,
"object": true, "of": true, "off": true, "offset": true, "oids": true, "old": true, "on": true, "only": true,
"option": true, "or": true, "order": true, "ordinality": true, "out": true, "outer": true, "over": true,
"overlaps": true, "overlay": true, "owned": true, "owner": true, "partial": true, "partition": true,
"placing": true, "position": true, "preceding": true, "precision": true, "prepare": true, "prepared": true,
"preserve": true, "primary": true, "prior": true, "privileges": true, "procedural": true, "procedure": true,
"range": true, "read": true, "real": true, "reassign": true, "recheck": true, "recursive": true, "ref": true,
"references": true, "refresh": true, "reindex": true, "relative": true, "release": true, "rename": true,
"repeatable": true, "replace": true, "replica": true, "reset": true, "restart": true, "restrict": true,
"returning": true, "returns": true, "revoke": true, "right": true, "role": true, "rollback": true,
"rollup": true, "row": true, "rows": true, "rule": true, "savepoint": true, "schema": true, "scroll": true,
"search": true, "second": true, "security": true, "select": true, "sequence": true, "serializable": true,
"session": true, "set": true, "setof": true, "share": true, "show": true, "similar": true, "simple": true,
"smallint": true, "snapshot": true, "some": true, "sql": true, "stable": true, "standalone": true,
"start": true, "statement": true, "statistics": true, "stdin": true, "stdout": true, "storage": true,
"strict": true, "strip": true, "substring": true, "symmetric": true, "sysid": true, "system": true,
"table": true, "tables": true, "tablespace": true, "temp": true, "template": true, "temporary": true,
"text": true, "then": true, "time": true, "timestamp": true, "to": true, "trailing": true, "transaction": true,
"transform": true, "treat": true, "trigger": true, "trim": true, "true": true, "truncate": true,
"trusted": true, "type": true, "types": true, "unbounded": true, "uncommitted": true, "unencrypted": true,
"union": true, "unique": true, "unknown": true, "unlisten": true, "unlogged": true, "until": true,
"update": true, "user": true, "using": true, "vacuum": true, "valid": true, "validate": true, "validator": true,
"value": true, "values": true, "varchar": true, "variadic": true, "varying": true, "verbose": true,
"version": true, "view": true, "volatile": true, "when": true, "where": true, "whitespace": true,
"window": true, "with": true, "within": true, "without": true, "work": true, "wrapper": true,
"write": true, "xml": true, "xmlattributes": true, "xmlconcat": true, "xmlelement": true, "xmlexists": true,
"xmlforest": true, "xmlparse": true, "xmlpi": true, "xmlroot": true, "xmlserialize": true, "year": true,
"yes": true, "zone": true,
}
// GetPostgresKeywords returns the reserved keywords as an unordered slice.
func GetPostgresKeywords() []string {
lst := make([]string, 0, len(postgresKeywords))
for k := range postgresKeywords {
lst = append(lst, k)
}
return lst
}
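A typical consumer of such a keyword list is identifier quoting. The sketch below uses a hypothetical three-entry subset of the map to show the pattern:

```go
package main

import "fmt"

// Hypothetical subset of postgresKeywords; the full map lives in pkg/pgsql.
var keywords = map[string]bool{
	"user":  true,
	"order": true,
	"table": true,
}

// quoteIfKeyword wraps an identifier in double quotes when it collides
// with a reserved word, which is the usual reason to keep such a list.
func quoteIfKeyword(ident string) string {
	if keywords[ident] {
		return `"` + ident + `"`
	}
	return ident
}

func main() {
	fmt.Println(quoteIfKeyword("user"))    // "user"
	fmt.Println(quoteIfKeyword("account")) // account
}
```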

pkg/readers/dbml/reader.go (new file, 400 lines)
package dbml
import (
"bufio"
"fmt"
"os"
"regexp"
"strings"
"git.warky.dev/wdevs/relspecgo/pkg/models"
"git.warky.dev/wdevs/relspecgo/pkg/readers"
)
// Reader implements the readers.Reader interface for DBML format
type Reader struct {
options *readers.ReaderOptions
}
// NewReader creates a new DBML reader with the given options
func NewReader(options *readers.ReaderOptions) *Reader {
return &Reader{
options: options,
}
}
// ReadDatabase reads and parses DBML input, returning a Database model
func (r *Reader) ReadDatabase() (*models.Database, error) {
if r.options.FilePath == "" {
return nil, fmt.Errorf("file path is required for DBML reader")
}
content, err := os.ReadFile(r.options.FilePath)
if err != nil {
return nil, fmt.Errorf("failed to read file: %w", err)
}
return r.parseDBML(string(content))
}
// ReadSchema reads and parses DBML input, returning a Schema model
func (r *Reader) ReadSchema() (*models.Schema, error) {
db, err := r.ReadDatabase()
if err != nil {
return nil, err
}
if len(db.Schemas) == 0 {
return nil, fmt.Errorf("no schemas found in DBML")
}
// Return the first schema
return db.Schemas[0], nil
}
// ReadTable reads and parses DBML input, returning a Table model
func (r *Reader) ReadTable() (*models.Table, error) {
schema, err := r.ReadSchema()
if err != nil {
return nil, err
}
if len(schema.Tables) == 0 {
return nil, fmt.Errorf("no tables found in DBML")
}
// Return the first table
return schema.Tables[0], nil
}
// parseDBML parses DBML content and returns a Database model
func (r *Reader) parseDBML(content string) (*models.Database, error) {
db := models.InitDatabase("database")
if r.options.Metadata != nil {
if name, ok := r.options.Metadata["name"].(string); ok {
db.Name = name
}
}
scanner := bufio.NewScanner(strings.NewReader(content))
schemaMap := make(map[string]*models.Schema)
var currentTable *models.Table
var currentSchema string
var inIndexes bool
var inTable bool
tableRegex := regexp.MustCompile(`^Table\s+([a-zA-Z0-9_.]+)\s*{`)
refRegex := regexp.MustCompile(`^Ref:\s+(.+)`)
for scanner.Scan() {
line := strings.TrimSpace(scanner.Text())
// Skip empty lines and comments
if line == "" || strings.HasPrefix(line, "//") {
continue
}
// Parse Table definition
if matches := tableRegex.FindStringSubmatch(line); matches != nil {
tableName := matches[1]
parts := strings.Split(tableName, ".")
if len(parts) == 2 {
currentSchema = parts[0]
tableName = parts[1]
} else {
currentSchema = "public"
}
// Ensure schema exists
if _, exists := schemaMap[currentSchema]; !exists {
schemaMap[currentSchema] = models.InitSchema(currentSchema)
}
currentTable = models.InitTable(tableName, currentSchema)
inTable = true
inIndexes = false
continue
}
// End of indexes section (checked before end of table, otherwise the
// indexes' closing brace would terminate the enclosing table)
if inIndexes && line == "}" {
inIndexes = false
continue
}
// End of table definition
if inTable && line == "}" {
if currentTable != nil && currentSchema != "" {
schemaMap[currentSchema].Tables = append(schemaMap[currentSchema].Tables, currentTable)
currentTable = nil
}
inTable = false
inIndexes = false
continue
}
// Parse indexes section
if inTable && strings.HasPrefix(line, "indexes") {
inIndexes = true
continue
}
// Parse index definition
if inIndexes && currentTable != nil {
index := r.parseIndex(line, currentTable.Name, currentSchema)
if index != nil {
currentTable.Indexes[index.Name] = index
}
continue
}
// Parse table note
if inTable && currentTable != nil && strings.HasPrefix(line, "Note:") {
note := strings.TrimPrefix(line, "Note:")
note = strings.Trim(note, " '\"")
currentTable.Description = note
continue
}
// Parse column definition
if inTable && !inIndexes && currentTable != nil {
column := r.parseColumn(line, currentTable.Name, currentSchema)
if column != nil {
currentTable.Columns[column.Name] = column
}
continue
}
// Parse Ref (relationship/foreign key)
if matches := refRegex.FindStringSubmatch(line); matches != nil {
constraint := r.parseRef(matches[1])
if constraint != nil {
// Find the table and add the constraint
for _, schema := range schemaMap {
for _, table := range schema.Tables {
if table.Schema == constraint.Schema && table.Name == constraint.Table {
table.Constraints[constraint.Name] = constraint
break
}
}
}
}
continue
}
}
// Add schemas to database
for _, schema := range schemaMap {
db.Schemas = append(db.Schemas, schema)
}
return db, nil
}
// parseColumn parses a DBML column definition
func (r *Reader) parseColumn(line, tableName, schemaName string) *models.Column {
// Format: column_name type [attributes] // comment
parts := strings.Fields(line)
if len(parts) < 2 {
return nil
}
columnName := parts[0]
columnType := parts[1]
column := models.InitColumn(columnName, tableName, schemaName)
column.Type = columnType
// Parse attributes in brackets
if strings.Contains(line, "[") && strings.Contains(line, "]") {
attrStart := strings.Index(line, "[")
attrEnd := strings.Index(line, "]")
if attrStart < attrEnd {
attrs := line[attrStart+1 : attrEnd]
attrList := strings.Split(attrs, ",")
for _, attr := range attrList {
attr = strings.TrimSpace(attr)
if strings.Contains(attr, "primary key") || attr == "pk" {
column.IsPrimaryKey = true
column.NotNull = true
} else if strings.Contains(attr, "not null") {
column.NotNull = true
} else if attr == "increment" {
column.AutoIncrement = true
} else if strings.HasPrefix(attr, "default:") {
defaultVal := strings.TrimSpace(strings.TrimPrefix(attr, "default:"))
column.Default = strings.Trim(defaultVal, "'\"")
} else if attr == "unique" {
// Could create a unique constraint here
}
}
}
}
// Parse inline comment
if strings.Contains(line, "//") {
commentStart := strings.Index(line, "//")
column.Comment = strings.TrimSpace(line[commentStart+2:])
}
return column
}
// parseIndex parses a DBML index definition
func (r *Reader) parseIndex(line, tableName, schemaName string) *models.Index {
// Format: (columns) [attributes]
if !strings.Contains(line, "(") || !strings.Contains(line, ")") {
return nil
}
colStart := strings.Index(line, "(")
colEnd := strings.Index(line, ")")
if colStart >= colEnd {
return nil
}
columnsStr := line[colStart+1 : colEnd]
columns := strings.Split(columnsStr, ",")
for i := range columns {
columns[i] = strings.TrimSpace(columns[i])
}
index := models.InitIndex("")
index.Table = tableName
index.Schema = schemaName
index.Columns = columns
// Parse attributes
if strings.Contains(line, "[") && strings.Contains(line, "]") {
attrStart := strings.Index(line, "[")
attrEnd := strings.Index(line, "]")
if attrStart < attrEnd {
attrs := line[attrStart+1 : attrEnd]
attrList := strings.Split(attrs, ",")
for _, attr := range attrList {
attr = strings.TrimSpace(attr)
if attr == "unique" {
index.Unique = true
} else if strings.HasPrefix(attr, "name:") {
name := strings.TrimSpace(strings.TrimPrefix(attr, "name:"))
index.Name = strings.Trim(name, "'\"")
} else if strings.HasPrefix(attr, "type:") {
indexType := strings.TrimSpace(strings.TrimPrefix(attr, "type:"))
index.Type = strings.Trim(indexType, "'\"")
}
}
}
}
// Generate name if not provided
if index.Name == "" {
index.Name = fmt.Sprintf("idx_%s_%s", tableName, strings.Join(columns, "_"))
}
return index
}
// parseRef parses a DBML Ref (foreign key relationship)
func (r *Reader) parseRef(refStr string) *models.Constraint {
// Format: schema.table.(columns) > schema.table.(columns) [actions]
// Split on the first relationship operator found. The operator's
// direction is not distinguished: the left side is always treated as
// the referencing (foreign key) side.
var fromPart, toPart string
for _, op := range []string{">", "<", "-"} {
if strings.Contains(refStr, op) {
parts := strings.Split(refStr, op)
if len(parts) == 2 {
fromPart = strings.TrimSpace(parts[0])
toPart = strings.TrimSpace(parts[1])
break
}
}
}
if fromPart == "" || toPart == "" {
return nil
}
// Remove actions part if present
if strings.Contains(toPart, "[") {
toPart = strings.TrimSpace(toPart[:strings.Index(toPart, "[")])
}
// Parse from table and column
fromSchema, fromTable, fromColumns := r.parseTableRef(fromPart)
toSchema, toTable, toColumns := r.parseTableRef(toPart)
if fromTable == "" || toTable == "" {
return nil
}
constraint := models.InitConstraint(
fmt.Sprintf("fk_%s_%s", fromTable, toTable),
models.ForeignKeyConstraint,
)
constraint.Schema = fromSchema
constraint.Table = fromTable
constraint.Columns = fromColumns
constraint.ReferencedSchema = toSchema
constraint.ReferencedTable = toTable
constraint.ReferencedColumns = toColumns
// Parse actions if present
if strings.Contains(refStr, "[") && strings.Contains(refStr, "]") {
actStart := strings.Index(refStr, "[")
actEnd := strings.Index(refStr, "]")
if actStart < actEnd {
actions := refStr[actStart+1 : actEnd]
actionList := strings.Split(actions, ",")
for _, action := range actionList {
action = strings.TrimSpace(action)
if strings.HasPrefix(action, "ondelete:") {
constraint.OnDelete = strings.TrimSpace(strings.TrimPrefix(action, "ondelete:"))
} else if strings.HasPrefix(action, "onupdate:") {
constraint.OnUpdate = strings.TrimSpace(strings.TrimPrefix(action, "onupdate:"))
}
}
}
}
return constraint
}
// parseTableRef parses a table reference like "schema.table.(column1, column2)"
func (r *Reader) parseTableRef(ref string) (schema, table string, columns []string) {
// Extract columns if present
if strings.Contains(ref, "(") && strings.Contains(ref, ")") {
colStart := strings.Index(ref, "(")
colEnd := strings.Index(ref, ")")
if colStart < colEnd {
columnsStr := ref[colStart+1 : colEnd]
for _, col := range strings.Split(columnsStr, ",") {
columns = append(columns, strings.TrimSpace(col))
}
}
// Drop the column list and any trailing "." (format: schema.table.(cols))
ref = strings.TrimSuffix(ref[:colStart], ".")
}
// Parse schema and table
parts := strings.Split(strings.TrimSpace(ref), ".")
if len(parts) == 2 {
schema = parts[0]
table = parts[1]
} else if len(parts) == 1 {
schema = "public"
table = parts[0]
}
return
}
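For reference, a minimal DBML fragment in the format the comments above describe — a schema-qualified table, column attributes, a note, an indexes block, and a top-level Ref. The attribute spellings (e.g. `ondelete:`) follow this parser rather than any official DBML dialect:

```dbml
Table auth.users {
  id bigint [pk, increment]
  email varchar [not null] // login address
  created_at timestamp [default: now()]

  Note: 'Application accounts'

  indexes {
    (email) [unique, name: 'users_email_key']
  }
}

Table auth.orders {
  id bigint [pk]
  user_id bigint [not null]
}

// Refs are only recognized at top level
Ref: auth.orders.(user_id) > auth.users.(id) [ondelete: cascade]
```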

pkg/readers/dctx/reader.go (new file, 486 lines)
package dctx
import (
"encoding/xml"
"fmt"
"os"
"strings"
"git.warky.dev/wdevs/relspecgo/pkg/models"
"git.warky.dev/wdevs/relspecgo/pkg/readers"
)
// Reader implements the readers.Reader interface for DCTX format
type Reader struct {
options *readers.ReaderOptions
}
// NewReader creates a new DCTX reader with the given options
func NewReader(options *readers.ReaderOptions) *Reader {
return &Reader{
options: options,
}
}
// ReadDatabase reads and parses DCTX input, returning a Database model
func (r *Reader) ReadDatabase() (*models.Database, error) {
if r.options.FilePath == "" {
return nil, fmt.Errorf("file path is required for DCTX reader")
}
data, err := os.ReadFile(r.options.FilePath)
if err != nil {
return nil, fmt.Errorf("failed to read file: %w", err)
}
var dctx DCTXDictionary
if err := xml.Unmarshal(data, &dctx); err != nil {
return nil, fmt.Errorf("failed to parse DCTX XML: %w", err)
}
return r.convertToDatabase(&dctx)
}
// ReadSchema reads and parses DCTX input, returning a Schema model
func (r *Reader) ReadSchema() (*models.Schema, error) {
db, err := r.ReadDatabase()
if err != nil {
return nil, err
}
if len(db.Schemas) == 0 {
return nil, fmt.Errorf("no schemas found in DCTX")
}
return db.Schemas[0], nil
}
// ReadTable reads and parses DCTX input, returning a Table model
func (r *Reader) ReadTable() (*models.Table, error) {
schema, err := r.ReadSchema()
if err != nil {
return nil, err
}
if len(schema.Tables) == 0 {
return nil, fmt.Errorf("no tables found in DCTX")
}
return schema.Tables[0], nil
}
// convertToDatabase converts a DCTX dictionary to a Database model
func (r *Reader) convertToDatabase(dctx *DCTXDictionary) (*models.Database, error) {
dbName := dctx.Name
if dbName == "" {
dbName = "database"
}
db := models.InitDatabase(dbName)
schema := models.InitSchema("public")
// Create GUID mappings for tables and keys
tableGuidMap := make(map[string]string) // GUID -> table name
keyGuidMap := make(map[string]*DCTXKey) // GUID -> key definition
keyTableMap := make(map[string]string) // key GUID -> table name
fieldGuidMaps := make(map[string]map[string]string) // table name -> field GUID -> field name
// First pass: build GUID mappings
for _, dctxTable := range dctx.Tables {
if !r.hasSQLOption(&dctxTable) {
continue
}
tableName := r.sanitizeName(dctxTable.Name)
tableGuidMap[dctxTable.Guid] = tableName
// Map keys to their table
for _, dctxKey := range dctxTable.Keys {
keyGuidMap[dctxKey.Guid] = &dctxKey
keyTableMap[dctxKey.Guid] = tableName
}
}
// Process tables - only include tables with SQL option enabled
for _, dctxTable := range dctx.Tables {
if !r.hasSQLOption(&dctxTable) {
continue
}
table, fieldGuidMap, err := r.convertTable(&dctxTable)
if err != nil {
return nil, fmt.Errorf("failed to convert table %s: %w", dctxTable.Name, err)
}
fieldGuidMaps[table.Name] = fieldGuidMap
schema.Tables = append(schema.Tables, table)
// Process keys (indexes, primary keys)
err = r.processKeys(&dctxTable, table, fieldGuidMap)
if err != nil {
return nil, fmt.Errorf("failed to process keys for table %s: %w", dctxTable.Name, err)
}
}
// Process relations
err := r.processRelations(dctx, schema, tableGuidMap, keyGuidMap, fieldGuidMaps)
if err != nil {
return nil, fmt.Errorf("failed to process relations: %w", err)
}
db.Schemas = append(db.Schemas, schema)
return db, nil
}
// hasSQLOption checks if a DCTX table has the SQL option set to "1"
func (r *Reader) hasSQLOption(dctxTable *DCTXTable) bool {
for _, option := range dctxTable.Options {
if option.Property == "SQL" && option.PropertyValue == "1" {
return true
}
}
return false
}
// convertTable converts a DCTX table to a Table model
func (r *Reader) convertTable(dctxTable *DCTXTable) (*models.Table, map[string]string, error) {
tableName := r.sanitizeName(dctxTable.Name)
table := models.InitTable(tableName, "public")
table.Description = dctxTable.Description
fieldGuidMap := make(map[string]string)
// Process fields
for _, dctxField := range dctxTable.Fields {
// Store GUID to name mapping
if dctxField.Guid != "" && dctxField.Name != "" {
fieldGuidMap[dctxField.Guid] = r.sanitizeName(dctxField.Name)
}
columns, err := r.convertField(&dctxField, table.Name)
if err != nil {
return nil, nil, fmt.Errorf("failed to convert field %s: %w", dctxField.Name, err)
}
// Add all columns
for _, column := range columns {
table.Columns[column.Name] = column
}
}
return table, fieldGuidMap, nil
}
// convertField converts a DCTX field to Column(s)
func (r *Reader) convertField(dctxField *DCTXField, tableName string) ([]*models.Column, error) {
var columns []*models.Column
// Handle GROUP fields (nested structures)
if dctxField.DataType == "GROUP" {
for _, subField := range dctxField.Fields {
subColumns, err := r.convertField(&subField, tableName)
if err != nil {
return nil, err
}
columns = append(columns, subColumns...)
}
return columns, nil
}
// Convert single field
column := models.InitColumn(r.sanitizeName(dctxField.Name), tableName, "public")
// Map Clarion data types
dataType, length := r.mapDataType(dctxField.DataType, dctxField.Size)
column.Type = dataType
column.Length = length
// Check for auto-increment (identity)
for _, option := range dctxField.Options {
if option.Property == "IsIdentity" && option.PropertyValue == "1" {
column.AutoIncrement = true
column.NotNull = true
}
}
columns = append(columns, column)
return columns, nil
}
// mapDataType maps Clarion data types to SQL types
func (r *Reader) mapDataType(clarionType string, size int) (string, int) {
switch strings.ToUpper(clarionType) {
case "LONG", "ULONG":
if size == 8 {
return "bigint", 0
}
return "integer", 0
case "SHORT", "USHORT", "BYTE":
return "smallint", 0
case "STRING":
if size > 0 {
return "varchar", size
}
return "text", 0
case "CSTRING":
if size > 0 {
// CSTRING includes null terminator, so subtract 1
length := size - 1
if length <= 0 {
length = 1
}
return "varchar", length
}
return "text", 0
case "PSTRING":
if size > 0 {
return "varchar", size
}
return "text", 0
case "DECIMAL":
return "decimal", 0
case "REAL":
// Clarion REAL is an 8-byte IEEE double
return "double precision", 0
case "SREAL":
// Clarion SREAL is a 4-byte single
return "real", 0
case "DATE":
return "date", 0
case "TIME":
return "time", 0
case "BLOB":
return "bytea", 0
case "MEMO":
return "text", 0
case "BOOL", "BOOLEAN":
return "boolean", 0
default:
return "text", 0
}
}
// processKeys processes DCTX keys and converts them to indexes and primary keys
func (r *Reader) processKeys(dctxTable *DCTXTable, table *models.Table, fieldGuidMap map[string]string) error {
for _, dctxKey := range dctxTable.Keys {
err := r.convertKey(&dctxKey, table, fieldGuidMap)
if err != nil {
return fmt.Errorf("failed to convert key %s: %w", dctxKey.Name, err)
}
}
return nil
}
// convertKey converts a DCTX key to appropriate constraint/index
func (r *Reader) convertKey(dctxKey *DCTXKey, table *models.Table, fieldGuidMap map[string]string) error {
var columns []string
// Extract column names from key components
if len(dctxKey.Components) > 0 {
for _, component := range dctxKey.Components {
if fieldName, exists := fieldGuidMap[component.FieldId]; exists {
columns = append(columns, fieldName)
}
}
}
// If no columns found, try to infer
if len(columns) == 0 {
if dctxKey.Primary {
// Look for common primary key column patterns
for colName := range table.Columns {
colNameLower := strings.ToLower(colName)
if strings.HasPrefix(colNameLower, "rid_") || strings.HasSuffix(colNameLower, "id") {
columns = append(columns, colName)
break
}
}
}
// If still no columns, skip
if len(columns) == 0 {
return nil
}
}
// Handle primary key
if dctxKey.Primary {
// Create primary key constraint
constraint := models.InitConstraint(r.sanitizeName(dctxKey.Name), models.PrimaryKeyConstraint)
constraint.Table = table.Name
constraint.Schema = table.Schema
constraint.Columns = columns
table.Constraints[constraint.Name] = constraint
// Mark columns as NOT NULL
for _, colName := range columns {
if col, exists := table.Columns[colName]; exists {
col.NotNull = true
col.IsPrimaryKey = true
}
}
return nil
}
// Handle regular index
index := models.InitIndex(r.sanitizeName(dctxKey.Name))
index.Table = table.Name
index.Schema = table.Schema
index.Columns = columns
index.Unique = dctxKey.Unique
index.Type = "btree"
table.Indexes[index.Name] = index
return nil
}
// processRelations processes DCTX relations and creates foreign keys
func (r *Reader) processRelations(dctx *DCTXDictionary, schema *models.Schema, tableGuidMap map[string]string, keyGuidMap map[string]*DCTXKey, fieldGuidMaps map[string]map[string]string) error {
for _, relation := range dctx.Relations {
// Get table names from GUIDs
primaryTableName := tableGuidMap[relation.PrimaryTable]
foreignTableName := tableGuidMap[relation.ForeignTable]
if primaryTableName == "" || foreignTableName == "" {
continue
}
// Find tables
var primaryTable, foreignTable *models.Table
for _, table := range schema.Tables {
if table.Name == primaryTableName {
primaryTable = table
}
if table.Name == foreignTableName {
foreignTable = table
}
}
if primaryTable == nil || foreignTable == nil {
continue
}
var fkColumns, pkColumns []string
// Try to use explicit field mappings
if len(relation.ForeignMappings) > 0 && len(relation.PrimaryMappings) > 0 {
foreignFieldMap := fieldGuidMaps[foreignTableName]
primaryFieldMap := fieldGuidMaps[primaryTableName]
for _, mapping := range relation.ForeignMappings {
if fieldName, exists := foreignFieldMap[mapping.Field]; exists {
fkColumns = append(fkColumns, fieldName)
}
}
for _, mapping := range relation.PrimaryMappings {
if fieldName, exists := primaryFieldMap[mapping.Field]; exists {
pkColumns = append(pkColumns, fieldName)
}
}
}
// Validate columns exist
if len(fkColumns) == 0 || len(pkColumns) == 0 {
continue
}
allFkColumnsExist := true
for _, colName := range fkColumns {
if _, exists := foreignTable.Columns[colName]; !exists {
allFkColumnsExist = false
break
}
}
if !allFkColumnsExist {
continue
}
allPkColumnsExist := true
for _, colName := range pkColumns {
if _, exists := primaryTable.Columns[colName]; !exists {
allPkColumnsExist = false
break
}
}
if !allPkColumnsExist {
continue
}
// Create foreign key
fkName := r.sanitizeName(fmt.Sprintf("fk_%s_%s", foreignTableName, primaryTableName))
constraint := models.InitConstraint(fkName, models.ForeignKeyConstraint)
constraint.Table = foreignTableName
constraint.Schema = "public"
constraint.Columns = fkColumns
constraint.ReferencedTable = primaryTableName
constraint.ReferencedSchema = "public"
constraint.ReferencedColumns = pkColumns
constraint.OnDelete = r.mapReferentialAction(relation.Delete)
constraint.OnUpdate = r.mapReferentialAction(relation.Update)
foreignTable.Constraints[fkName] = constraint
// Create relationship
relationshipName := fmt.Sprintf("%s_to_%s", foreignTableName, primaryTableName)
relationship := models.InitRelationship(relationshipName, models.OneToMany)
relationship.FromTable = primaryTableName
relationship.FromSchema = "public"
relationship.ToTable = foreignTableName
relationship.ToSchema = "public"
relationship.ForeignKey = fkName
relationship.Properties["on_delete"] = constraint.OnDelete
relationship.Properties["on_update"] = constraint.OnUpdate
foreignTable.Relationships[relationshipName] = relationship
}
return nil
}
// mapReferentialAction maps DCTX referential actions to SQL syntax
func (r *Reader) mapReferentialAction(action string) string {
switch strings.ToUpper(action) {
case "RESTRICT", "RESTRICT_SERVER":
return "RESTRICT"
case "CASCADE", "CASCADE_SERVER":
return "CASCADE"
case "SET_NULL", "SET_NULL_SERVER":
return "SET NULL"
case "SET_DEFAULT", "SET_DEFAULT_SERVER":
return "SET DEFAULT"
case "NO_ACTION", "NO_ACTION_SERVER":
return "NO ACTION"
default:
return "RESTRICT"
}
}
// sanitizeName sanitizes a name to lowercase
func (r *Reader) sanitizeName(name string) string {
return strings.ToLower(name)
}

pkg/readers/dctx/types.go

@@ -0,0 +1,84 @@
package dctx
import "encoding/xml"
// DCTXDictionary represents the root element of a DCTX file
type DCTXDictionary struct {
XMLName xml.Name `xml:"Dictionary"`
Name string `xml:"Name,attr"`
Version string `xml:"Version,attr"`
Tables []DCTXTable `xml:"Table"`
Relations []DCTXRelation `xml:"Relation"`
}
// DCTXTable represents a table definition in DCTX
type DCTXTable struct {
Guid string `xml:"Guid,attr"`
Name string `xml:"Name,attr"`
Prefix string `xml:"Prefix,attr"`
Driver string `xml:"Driver,attr"`
Owner string `xml:"Owner,attr"`
Path string `xml:"Path,attr"`
Description string `xml:"Description,attr"`
Fields []DCTXField `xml:"Field"`
Keys []DCTXKey `xml:"Key"`
Options []DCTXOption `xml:"Option"`
}
// DCTXField represents a field/column definition in DCTX
type DCTXField struct {
Guid string `xml:"Guid,attr"`
Name string `xml:"Name,attr"`
DataType string `xml:"DataType,attr"`
Size int `xml:"Size,attr"`
NoPopulate bool `xml:"NoPopulate,attr"`
Thread bool `xml:"Thread,attr"`
Fields []DCTXField `xml:"Field"` // For GROUP fields (nested structures)
Options []DCTXOption `xml:"Option"`
}
// DCTXKey represents an index or key definition in DCTX
type DCTXKey struct {
Guid string `xml:"Guid,attr"`
Name string `xml:"Name,attr"`
KeyType string `xml:"KeyType,attr"`
Primary bool `xml:"Primary,attr"`
Unique bool `xml:"Unique,attr"`
Order int `xml:"Order,attr"`
Description string `xml:"Description,attr"`
Components []DCTXComponent `xml:"Component"`
}
// DCTXComponent represents a component of a key (field reference)
type DCTXComponent struct {
Guid string `xml:"Guid,attr"`
FieldId string `xml:"FieldId,attr"`
Order int `xml:"Order,attr"`
Ascend bool `xml:"Ascend,attr"`
}
// DCTXOption represents a property option in DCTX
type DCTXOption struct {
Property string `xml:"Property,attr"`
PropertyType string `xml:"PropertyType,attr"`
PropertyValue string `xml:"PropertyValue,attr"`
}
// DCTXRelation represents a relationship/foreign key in DCTX
type DCTXRelation struct {
Guid string `xml:"Guid,attr"`
PrimaryTable string `xml:"PrimaryTable,attr"`
ForeignTable string `xml:"ForeignTable,attr"`
PrimaryKey string `xml:"PrimaryKey,attr"`
ForeignKey string `xml:"ForeignKey,attr"`
Delete string `xml:"Delete,attr"`
Update string `xml:"Update,attr"`
ForeignMappings []DCTXFieldMapping `xml:"ForeignMapping"`
PrimaryMappings []DCTXFieldMapping `xml:"PrimaryMapping"`
}
// DCTXFieldMapping represents a field mapping in a relation
type DCTXFieldMapping struct {
Guid string `xml:"Guid,attr"`
Field string `xml:"Field,attr"`
}


@@ -0,0 +1,304 @@
package drawdb
import (
"encoding/json"
"fmt"
"os"
"strconv"
"strings"
"git.warky.dev/wdevs/relspecgo/pkg/models"
"git.warky.dev/wdevs/relspecgo/pkg/readers"
"git.warky.dev/wdevs/relspecgo/pkg/writers/drawdb"
)
// Reader implements the readers.Reader interface for DrawDB JSON format
type Reader struct {
options *readers.ReaderOptions
}
// NewReader creates a new DrawDB reader with the given options
func NewReader(options *readers.ReaderOptions) *Reader {
return &Reader{
options: options,
}
}
// ReadDatabase reads and parses DrawDB JSON input, returning a Database model
func (r *Reader) ReadDatabase() (*models.Database, error) {
if r.options.FilePath == "" {
return nil, fmt.Errorf("file path is required for DrawDB reader")
}
data, err := os.ReadFile(r.options.FilePath)
if err != nil {
return nil, fmt.Errorf("failed to read file: %w", err)
}
var drawSchema drawdb.DrawDBSchema
if err := json.Unmarshal(data, &drawSchema); err != nil {
return nil, fmt.Errorf("failed to parse DrawDB JSON: %w", err)
}
return r.convertToDatabase(&drawSchema)
}
// ReadSchema reads and parses DrawDB JSON input, returning a Schema model
func (r *Reader) ReadSchema() (*models.Schema, error) {
if r.options.FilePath == "" {
return nil, fmt.Errorf("file path is required for DrawDB reader")
}
data, err := os.ReadFile(r.options.FilePath)
if err != nil {
return nil, fmt.Errorf("failed to read file: %w", err)
}
var drawSchema drawdb.DrawDBSchema
if err := json.Unmarshal(data, &drawSchema); err != nil {
return nil, fmt.Errorf("failed to parse DrawDB JSON: %w", err)
}
return r.convertToSchema(&drawSchema, "default")
}
// ReadTable reads and parses DrawDB JSON input, returning a Table model
func (r *Reader) ReadTable() (*models.Table, error) {
if r.options.FilePath == "" {
return nil, fmt.Errorf("file path is required for DrawDB reader")
}
data, err := os.ReadFile(r.options.FilePath)
if err != nil {
return nil, fmt.Errorf("failed to read file: %w", err)
}
var drawSchema drawdb.DrawDBSchema
if err := json.Unmarshal(data, &drawSchema); err != nil {
return nil, fmt.Errorf("failed to parse DrawDB JSON: %w", err)
}
if len(drawSchema.Tables) == 0 {
return nil, fmt.Errorf("no tables found in DrawDB JSON")
}
// Return the first table
return r.convertToTable(drawSchema.Tables[0], &drawSchema)
}
// convertToDatabase converts a DrawDB schema to a Database model
func (r *Reader) convertToDatabase(drawSchema *drawdb.DrawDBSchema) (*models.Database, error) {
db := models.InitDatabase("database")
if r.options.Metadata != nil {
if name, ok := r.options.Metadata["name"].(string); ok {
db.Name = name
}
}
// Extract database info from notes
for _, note := range drawSchema.Notes {
if strings.HasPrefix(note.Content, "Database:") {
parts := strings.SplitN(note.Content, "\n\n", 2)
if len(parts) == 2 {
db.Description = parts[1]
}
}
}
// Group tables by schema
schemaMap := make(map[string]*models.Schema)
for _, drawTable := range drawSchema.Tables {
schemaName := drawTable.Schema
if schemaName == "" {
schemaName = "public"
}
schema, exists := schemaMap[schemaName]
if !exists {
schema = models.InitSchema(schemaName)
schemaMap[schemaName] = schema
}
table, err := r.convertToTable(drawTable, drawSchema)
if err != nil {
return nil, fmt.Errorf("failed to convert table %s: %w", drawTable.Name, err)
}
schema.Tables = append(schema.Tables, table)
}
// Add schemas to database
for _, schema := range schemaMap {
db.Schemas = append(db.Schemas, schema)
}
return db, nil
}
// convertToSchema converts DrawDB tables to a Schema model
func (r *Reader) convertToSchema(drawSchema *drawdb.DrawDBSchema, schemaName string) (*models.Schema, error) {
schema := models.InitSchema(schemaName)
for _, drawTable := range drawSchema.Tables {
// Filter by schema if specified in the table
if drawTable.Schema != "" && drawTable.Schema != schemaName {
continue
}
table, err := r.convertToTable(drawTable, drawSchema)
if err != nil {
return nil, fmt.Errorf("failed to convert table %s: %w", drawTable.Name, err)
}
schema.Tables = append(schema.Tables, table)
}
return schema, nil
}
// convertToTable converts a DrawDB table to a Table model
func (r *Reader) convertToTable(drawTable *drawdb.DrawDBTable, drawSchema *drawdb.DrawDBSchema) (*models.Table, error) {
schemaName := drawTable.Schema
if schemaName == "" {
schemaName = "public"
}
table := models.InitTable(drawTable.Name, schemaName)
table.Description = drawTable.Comment
// Convert fields to columns
for _, field := range drawTable.Fields {
column := r.convertToColumn(field, drawTable.Name, schemaName)
table.Columns[column.Name] = column
}
// Convert indexes
for _, index := range drawTable.Indexes {
idx := r.convertToIndex(index, drawTable, schemaName)
table.Indexes[idx.Name] = idx
}
// Find and convert relationships/constraints for this table
for _, rel := range drawSchema.Relationships {
if rel.StartTableID == drawTable.ID {
constraint := r.convertToConstraint(rel, drawSchema)
if constraint != nil {
table.Constraints[constraint.Name] = constraint
}
}
}
return table, nil
}
// convertToColumn converts a DrawDB field to a Column model
func (r *Reader) convertToColumn(field *drawdb.DrawDBField, tableName, schemaName string) *models.Column {
column := models.InitColumn(field.Name, tableName, schemaName)
// Parse type and dimensions
typeStr := field.Type
column.Type = typeStr
// Try to extract length/precision from type string like "varchar(255)" or "decimal(10,2)"
if strings.Contains(typeStr, "(") {
parts := strings.Split(typeStr, "(")
column.Type = parts[0]
if len(parts) > 1 {
dimensions := strings.TrimSuffix(parts[1], ")")
if strings.Contains(dimensions, ",") {
// Precision and scale (e.g., decimal(10,2))
dims := strings.Split(dimensions, ",")
if precision, err := strconv.Atoi(strings.TrimSpace(dims[0])); err == nil {
column.Precision = precision
}
if len(dims) > 1 {
if scale, err := strconv.Atoi(strings.TrimSpace(dims[1])); err == nil {
column.Scale = scale
}
}
} else {
// Just length (e.g., varchar(255))
if length, err := strconv.Atoi(dimensions); err == nil {
column.Length = length
}
}
}
}
column.IsPrimaryKey = field.Primary
column.NotNull = field.NotNull || field.Primary
column.AutoIncrement = field.Increment
column.Comment = field.Comment
if field.Default != "" {
column.Default = field.Default
}
return column
}
// convertToIndex converts a DrawDB index to an Index model
func (r *Reader) convertToIndex(drawIndex *drawdb.DrawDBIndex, drawTable *drawdb.DrawDBTable, schemaName string) *models.Index {
index := models.InitIndex(drawIndex.Name)
index.Table = drawTable.Name
index.Schema = schemaName
index.Unique = drawIndex.Unique
// Convert field IDs to column names
for _, fieldID := range drawIndex.Fields {
if fieldID >= 0 && fieldID < len(drawTable.Fields) {
index.Columns = append(index.Columns, drawTable.Fields[fieldID].Name)
}
}
return index
}
// convertToConstraint converts a DrawDB relationship to a Constraint model
func (r *Reader) convertToConstraint(rel *drawdb.DrawDBRelationship, drawSchema *drawdb.DrawDBSchema) *models.Constraint {
// Find the start and end tables
var startTable, endTable *drawdb.DrawDBTable
for _, table := range drawSchema.Tables {
if table.ID == rel.StartTableID {
startTable = table
}
if table.ID == rel.EndTableID {
endTable = table
}
}
if startTable == nil || endTable == nil {
return nil
}
constraint := models.InitConstraint(rel.Name, models.ForeignKeyConstraint)
// Get the column names from field IDs
if rel.StartFieldID >= 0 && rel.StartFieldID < len(startTable.Fields) {
constraint.Columns = append(constraint.Columns, startTable.Fields[rel.StartFieldID].Name)
}
if rel.EndFieldID >= 0 && rel.EndFieldID < len(endTable.Fields) {
constraint.ReferencedColumns = append(constraint.ReferencedColumns, endTable.Fields[rel.EndFieldID].Name)
}
constraint.Table = startTable.Name
constraint.Schema = startTable.Schema
if constraint.Schema == "" {
constraint.Schema = "public"
}
constraint.ReferencedTable = endTable.Name
constraint.ReferencedSchema = endTable.Schema
if constraint.ReferencedSchema == "" {
constraint.ReferencedSchema = "public"
}
constraint.OnUpdate = rel.UpdateConstraint
constraint.OnDelete = rel.DeleteConstraint
return constraint
}


@@ -5,10 +5,16 @@ import (
)
// Reader defines the interface for reading database specifications
// from various input formats
// from various input formats at different granularity levels
type Reader interface {
// Read reads and parses the input, returning a Database model
Read() (*models.Database, error)
// ReadDatabase reads and parses the input, returning a Database model
ReadDatabase() (*models.Database, error)
// ReadSchema reads and parses the input, returning a Schema model
ReadSchema() (*models.Schema, error)
// ReadTable reads and parses the input, returning a Table model
ReadTable() (*models.Table, error)
}
// ReaderOptions contains common options for readers


@@ -12,8 +12,8 @@ func NewTransformer() *Transformer {
return &Transformer{}
}
// Validate validates a database model for correctness
func (t *Transformer) Validate(db *models.Database) error {
// ValidateDatabase validates a database model for correctness
func (t *Transformer) ValidateDatabase(db *models.Database) error {
// TODO: Implement validation logic
// - Check for duplicate table names
// - Validate column types
@@ -22,11 +22,45 @@ func (t *Transformer) Validate(db *models.Database) error {
return nil
}
// Normalize normalizes a database model to a standard format
func (t *Transformer) Normalize(db *models.Database) (*models.Database, error) {
// ValidateSchema validates a schema model for correctness
func (t *Transformer) ValidateSchema(schema *models.Schema) error {
// TODO: Implement validation logic
// - Check for duplicate table names within schema
// - Validate table references
return nil
}
// ValidateTable validates a table model for correctness
func (t *Transformer) ValidateTable(table *models.Table) error {
// TODO: Implement validation logic
// - Validate column types
// - Ensure constraints reference existing columns
// - Validate relation integrity
return nil
}
// NormalizeDatabase normalizes a database model to a standard format
func (t *Transformer) NormalizeDatabase(db *models.Database) (*models.Database, error) {
// TODO: Implement normalization logic
// - Standardize naming conventions
// - Order tables/columns consistently
// - Apply default values
return db, nil
}
// NormalizeSchema normalizes a schema model to a standard format
func (t *Transformer) NormalizeSchema(schema *models.Schema) (*models.Schema, error) {
// TODO: Implement normalization logic
// - Standardize naming conventions
// - Order tables/columns consistently
return schema, nil
}
// NormalizeTable normalizes a table model to a standard format
func (t *Transformer) NormalizeTable(table *models.Table) (*models.Table, error) {
// TODO: Implement normalization logic
// - Standardize naming conventions
// - Order columns consistently
// - Apply default values
return table, nil
}


@@ -0,0 +1,284 @@
package bun
import (
"strings"
"unicode"
)
// SnakeCaseToPascalCase converts snake_case to PascalCase
// Examples: user_id → UserID, http_request → HTTPRequest
func SnakeCaseToPascalCase(s string) string {
if s == "" {
return ""
}
parts := strings.Split(s, "_")
for i, part := range parts {
parts[i] = capitalize(part)
}
return strings.Join(parts, "")
}
// SnakeCaseToCamelCase converts snake_case to camelCase
// Examples: user_id → userID, http_request → httpRequest
func SnakeCaseToCamelCase(s string) string {
if s == "" {
return ""
}
parts := strings.Split(s, "_")
for i, part := range parts {
if i == 0 {
parts[i] = strings.ToLower(part)
} else {
parts[i] = capitalize(part)
}
}
return strings.Join(parts, "")
}
// PascalCaseToSnakeCase converts PascalCase to snake_case
// Examples: UserID → user_id, HTTPRequest → http_request
func PascalCaseToSnakeCase(s string) string {
if s == "" {
return ""
}
var result strings.Builder
var prevUpper bool
var nextUpper bool
runes := []rune(s)
for i, r := range runes {
isUpper := unicode.IsUpper(r)
if i+1 < len(runes) {
nextUpper = unicode.IsUpper(runes[i+1])
} else {
nextUpper = false
}
if i > 0 && isUpper {
// Add underscore before uppercase letter if:
// 1. Previous char was lowercase, OR
// 2. Next char is lowercase (end of acronym)
if !prevUpper || (!nextUpper && i+1 < len(runes)) {
result.WriteRune('_')
}
}
result.WriteRune(unicode.ToLower(r))
prevUpper = isUpper
}
return result.String()
}
// capitalize capitalizes the first letter and handles common acronyms
func capitalize(s string) string {
if s == "" {
return ""
}
upper := strings.ToUpper(s)
// Handle common acronyms
acronyms := map[string]bool{
"ID": true,
"UUID": true,
"GUID": true,
"URL": true,
"URI": true,
"HTTP": true,
"HTTPS": true,
"API": true,
"JSON": true,
"XML": true,
"SQL": true,
"HTML": true,
"CSS": true,
"RID": true,
}
if acronyms[upper] {
return upper
}
// Capitalize first letter
runes := []rune(s)
runes[0] = unicode.ToUpper(runes[0])
return string(runes)
}
// Pluralize converts a singular word to plural
// Basic implementation with common rules
func Pluralize(s string) string {
if s == "" {
return ""
}
// Special cases
irregular := map[string]string{
"person": "people",
"child": "children",
"tooth": "teeth",
"foot": "feet",
"man": "men",
"woman": "women",
"mouse": "mice",
"goose": "geese",
"ox": "oxen",
"datum": "data",
"medium": "media",
"analysis": "analyses",
"crisis": "crises",
"status": "statuses",
}
if plural, ok := irregular[strings.ToLower(s)]; ok {
return plural
}
// Already plural (ends in 's' but not 'ss' or 'us')
if strings.HasSuffix(s, "s") && !strings.HasSuffix(s, "ss") && !strings.HasSuffix(s, "us") {
return s
}
// Words ending in s, x, z, ch, sh
if strings.HasSuffix(s, "s") || strings.HasSuffix(s, "x") ||
strings.HasSuffix(s, "z") || strings.HasSuffix(s, "ch") ||
strings.HasSuffix(s, "sh") {
return s + "es"
}
// Words ending in consonant + y
if len(s) >= 2 && strings.HasSuffix(s, "y") {
prevChar := s[len(s)-2]
if !isVowel(prevChar) {
return s[:len(s)-1] + "ies"
}
}
// Words ending in f or fe
if strings.HasSuffix(s, "f") {
return s[:len(s)-1] + "ves"
}
if strings.HasSuffix(s, "fe") {
return s[:len(s)-2] + "ves"
}
// Words ending in consonant + o
if len(s) >= 2 && strings.HasSuffix(s, "o") {
prevChar := s[len(s)-2]
if !isVowel(prevChar) {
return s + "es"
}
}
// Default: add 's'
return s + "s"
}
// Singularize converts a plural word to singular
// Basic implementation with common rules
func Singularize(s string) string {
if s == "" {
return ""
}
// Special cases
irregular := map[string]string{
"people": "person",
"children": "child",
"teeth": "tooth",
"feet": "foot",
"men": "man",
"women": "woman",
"mice": "mouse",
"geese": "goose",
"oxen": "ox",
"data": "datum",
"media": "medium",
"analyses": "analysis",
"crises": "crisis",
"statuses": "status",
}
if singular, ok := irregular[strings.ToLower(s)]; ok {
return singular
}
// Words ending in ies
if strings.HasSuffix(s, "ies") && len(s) > 3 {
return s[:len(s)-3] + "y"
}
// Words ending in ves
if strings.HasSuffix(s, "ves") {
return s[:len(s)-3] + "f"
}
// Words ending in ses, xes, zes, ches, shes
if strings.HasSuffix(s, "ses") || strings.HasSuffix(s, "xes") ||
strings.HasSuffix(s, "zes") || strings.HasSuffix(s, "ches") ||
strings.HasSuffix(s, "shes") {
return s[:len(s)-2]
}
// Words ending in s (not ss)
if strings.HasSuffix(s, "s") && !strings.HasSuffix(s, "ss") {
return s[:len(s)-1]
}
// Already singular
return s
}
// GeneratePrefix generates a 3-letter prefix from a table name
// Examples: process → PRO, user → USE, master_task → MTA
func GeneratePrefix(tableName string) string {
if tableName == "" {
return "TBL"
}
// Remove common prefixes
tableName = strings.TrimPrefix(tableName, "tbl_")
tableName = strings.TrimPrefix(tableName, "tb_")
// Split by underscore and take first letters
parts := strings.Split(tableName, "_")
var prefix strings.Builder
for _, part := range parts {
if part == "" {
continue
}
prefix.WriteRune(unicode.ToUpper(rune(part[0])))
if prefix.Len() >= 3 {
break
}
}
result := prefix.String()
// If we don't have 3 letters yet, add more from the first part
if len(result) < 3 && len(parts) > 0 {
firstPart := parts[0]
for i := 1; i < len(firstPart) && len(result) < 3; i++ {
result += strings.ToUpper(string(firstPart[i]))
}
}
// Pad with 'X' if still too short
for len(result) < 3 {
result += "X"
}
return result[:3]
}
// isVowel checks if a byte is a vowel
func isVowel(c byte) bool {
c = byte(unicode.ToLower(rune(c)))
return c == 'a' || c == 'e' || c == 'i' || c == 'o' || c == 'u'
}


@@ -0,0 +1,250 @@
package bun
import (
"sort"
"git.warky.dev/wdevs/relspecgo/pkg/models"
)
// TemplateData represents the data passed to the template for code generation
type TemplateData struct {
PackageName string
Imports []string
Models []*ModelData
Config *MethodConfig
}
// ModelData represents a single model/struct in the template
type ModelData struct {
Name string
TableName string // schema.table format
SchemaName string
TableNameOnly string // just table name without schema
Comment string
Fields []*FieldData
Config *MethodConfig
PrimaryKeyField string // Name of the primary key field
IDColumnName string // Name of the ID column in database
Prefix string // 3-letter prefix
PrimaryKeyIsSQL bool // True when the primary key uses a Sql* wrapper type (referenced by the model template)
}
// FieldData represents a single field in a struct
type FieldData struct {
Name string // Go field name (PascalCase)
Type string // Go type
BunTag string // Complete Bun tag
JSONTag string // JSON tag
Comment string // Field comment
}
// MethodConfig controls which helper methods to generate
type MethodConfig struct {
GenerateTableName bool
GenerateSchemaName bool
GenerateTableNameOnly bool
GenerateGetID bool
GenerateGetIDStr bool
GenerateSetID bool
GenerateUpdateID bool
GenerateGetIDName bool
GenerateGetPrefix bool
}
// DefaultMethodConfig returns a MethodConfig with all methods enabled
func DefaultMethodConfig() *MethodConfig {
return &MethodConfig{
GenerateTableName: true,
GenerateSchemaName: true,
GenerateTableNameOnly: true,
GenerateGetID: true,
GenerateGetIDStr: true,
GenerateSetID: true,
GenerateUpdateID: true,
GenerateGetIDName: true,
GenerateGetPrefix: true,
}
}
// NewTemplateData creates a new TemplateData with the given package name and config
func NewTemplateData(packageName string, config *MethodConfig) *TemplateData {
if config == nil {
config = DefaultMethodConfig()
}
return &TemplateData{
PackageName: packageName,
Imports: make([]string, 0),
Models: make([]*ModelData, 0),
Config: config,
}
}
// AddModel adds a model to the template data
func (td *TemplateData) AddModel(model *ModelData) {
model.Config = td.Config
td.Models = append(td.Models, model)
}
// AddImport adds an import to the template data (deduplicates automatically)
func (td *TemplateData) AddImport(importPath string) {
// Check if already exists
for _, imp := range td.Imports {
if imp == importPath {
return
}
}
td.Imports = append(td.Imports, importPath)
}
// FinalizeImports sorts and organizes imports
func (td *TemplateData) FinalizeImports() {
// Sort imports alphabetically
sort.Strings(td.Imports)
}
// NewModelData creates a new ModelData from a models.Table
func NewModelData(table *models.Table, schema string, typeMapper *TypeMapper) *ModelData {
tableName := table.Name
if schema != "" {
tableName = schema + "." + table.Name
}
// Generate model name: singularize and convert to PascalCase
singularTable := Singularize(table.Name)
modelName := SnakeCaseToPascalCase(singularTable)
// Add "Model" prefix if not already present
if !hasModelPrefix(modelName) {
modelName = "Model" + modelName
}
model := &ModelData{
Name: modelName,
TableName: tableName,
SchemaName: schema,
TableNameOnly: table.Name,
Comment: formatComment(table.Description, table.Comment),
Fields: make([]*FieldData, 0),
Prefix: GeneratePrefix(table.Name),
}
// Find primary key
for _, col := range table.Columns {
if col.IsPrimaryKey {
model.PrimaryKeyField = SnakeCaseToPascalCase(col.Name)
model.IDColumnName = col.Name
break
}
}
// Convert columns to fields (sorted by sequence or name)
columns := sortColumns(table.Columns)
for _, col := range columns {
field := columnToField(col, table, typeMapper)
model.Fields = append(model.Fields, field)
}
return model
}
// columnToField converts a models.Column to FieldData
func columnToField(col *models.Column, table *models.Table, typeMapper *TypeMapper) *FieldData {
fieldName := SnakeCaseToPascalCase(col.Name)
goType := typeMapper.SQLTypeToGoType(col.Type, col.NotNull)
bunTag := typeMapper.BuildBunTag(col, table)
jsonTag := col.Name // Use column name for JSON tag
return &FieldData{
Name: fieldName,
Type: goType,
BunTag: bunTag,
JSONTag: jsonTag,
Comment: formatComment(col.Description, col.Comment),
}
}
// AddRelationshipField adds a relationship field to the model
func (md *ModelData) AddRelationshipField(field *FieldData) {
md.Fields = append(md.Fields, field)
}
// formatComment combines description and comment into a single comment string
func formatComment(description, comment string) string {
if description != "" && comment != "" {
return description + " - " + comment
}
if description != "" {
return description
}
return comment
}
// hasModelPrefix checks if a name already has "Model" prefix
func hasModelPrefix(name string) bool {
return len(name) >= 5 && name[:5] == "Model"
}
// sortColumns sorts columns by sequence, then by name
func sortColumns(columns map[string]*models.Column) []*models.Column {
result := make([]*models.Column, 0, len(columns))
for _, col := range columns {
result = append(result, col)
}
sort.Slice(result, func(i, j int) bool {
// Sort by sequence if both have it
if result[i].Sequence > 0 && result[j].Sequence > 0 {
return result[i].Sequence < result[j].Sequence
}
// Put primary keys first
if result[i].IsPrimaryKey != result[j].IsPrimaryKey {
return result[i].IsPrimaryKey
}
// Otherwise sort alphabetically
return result[i].Name < result[j].Name
})
return result
}
// LoadMethodConfigFromMetadata loads method configuration from metadata map
func LoadMethodConfigFromMetadata(metadata map[string]interface{}) *MethodConfig {
config := DefaultMethodConfig()
if metadata == nil {
return config
}
// Load each setting from metadata if present
if val, ok := metadata["generate_table_name"].(bool); ok {
config.GenerateTableName = val
}
if val, ok := metadata["generate_schema_name"].(bool); ok {
config.GenerateSchemaName = val
}
if val, ok := metadata["generate_table_name_only"].(bool); ok {
config.GenerateTableNameOnly = val
}
if val, ok := metadata["generate_get_id"].(bool); ok {
config.GenerateGetID = val
}
if val, ok := metadata["generate_get_id_str"].(bool); ok {
config.GenerateGetIDStr = val
}
if val, ok := metadata["generate_set_id"].(bool); ok {
config.GenerateSetID = val
}
if val, ok := metadata["generate_update_id"].(bool); ok {
config.GenerateUpdateID = val
}
if val, ok := metadata["generate_get_id_name"].(bool); ok {
config.GenerateGetIDName = val
}
if val, ok := metadata["generate_get_prefix"].(bool); ok {
config.GenerateGetPrefix = val
}
return config
}


@@ -0,0 +1,118 @@
package bun
import (
"bytes"
"text/template"
)
// modelTemplate defines the template for generating Bun models
const modelTemplate = `// Code generated by relspecgo. DO NOT EDIT.
package {{.PackageName}}
{{if .Imports -}}
import (
{{range .Imports -}}
{{.}}
{{end -}}
)
{{end}}
{{range .Models}}
{{if .Comment}}// {{.Comment}}{{end}}
type {{.Name}} struct {
bun.BaseModel ` + "`bun:\"table:{{.TableName}},alias:{{.TableNameOnly}}\"`" + `
{{- range .Fields}}
{{.Name}} {{.Type}} ` + "`bun:\"{{.BunTag}}\" json:\"{{.JSONTag}}\"`" + `{{if .Comment}} // {{.Comment}}{{end}}
{{- end}}
}
{{if .Config.GenerateTableName}}
// TableName returns the table name for {{.Name}}
func (m {{.Name}}) TableName() string {
return "{{.TableName}}"
}
{{end}}
{{if .Config.GenerateTableNameOnly}}
// TableNameOnly returns the table name without schema for {{.Name}}
func (m {{.Name}}) TableNameOnly() string {
return "{{.TableNameOnly}}"
}
{{end}}
{{if .Config.GenerateSchemaName}}
// SchemaName returns the schema name for {{.Name}}
func (m {{.Name}}) SchemaName() string {
return "{{.SchemaName}}"
}
{{end}}
{{if and .Config.GenerateGetID .PrimaryKeyField}}
// GetID returns the primary key value
func (m {{.Name}}) GetID() int64 {
{{if .PrimaryKeyIsSQL -}}
return m.{{.PrimaryKeyField}}.Int64()
{{- else -}}
return int64(m.{{.PrimaryKeyField}})
{{- end}}
}
{{end}}
{{if and .Config.GenerateGetIDStr .PrimaryKeyField}}
// GetIDStr returns the primary key as a string
func (m {{.Name}}) GetIDStr() string {
return fmt.Sprintf("%d", m.{{.PrimaryKeyField}})
}
{{end}}
{{if and .Config.GenerateSetID .PrimaryKeyField}}
// SetID sets the primary key value
func (m *{{.Name}}) SetID(newid int64) {
m.UpdateID(newid)
}
{{end}}
{{if and .Config.GenerateUpdateID .PrimaryKeyField}}
// UpdateID updates the primary key value
func (m *{{.Name}}) UpdateID(newid int64) {
{{if .PrimaryKeyIsSQL -}}
m.{{.PrimaryKeyField}}.FromString(fmt.Sprintf("%d", newid))
{{- else -}}
m.{{.PrimaryKeyField}} = int32(newid)
{{- end}}
}
{{end}}
{{if and .Config.GenerateGetIDName .IDColumnName}}
// GetIDName returns the name of the primary key column
func (m {{.Name}}) GetIDName() string {
return "{{.IDColumnName}}"
}
{{end}}
{{if .Config.GenerateGetPrefix}}
// GetPrefix returns the table prefix
func (m {{.Name}}) GetPrefix() string {
return "{{.Prefix}}"
}
{{end}}
{{end -}}
`
// Templates holds the parsed templates
type Templates struct {
modelTmpl *template.Template
}
// NewTemplates creates and parses the templates
func NewTemplates() (*Templates, error) {
modelTmpl, err := template.New("model").Parse(modelTemplate)
if err != nil {
return nil, err
}
return &Templates{
modelTmpl: modelTmpl,
}, nil
}
// GenerateCode executes the template with the given data
func (t *Templates) GenerateCode(data *TemplateData) (string, error) {
var buf bytes.Buffer
err := t.modelTmpl.Execute(&buf, data)
if err != nil {
return "", err
}
return buf.String(), nil
}


@@ -0,0 +1,253 @@
package bun
import (
"fmt"
"strings"
"git.warky.dev/wdevs/relspecgo/pkg/models"
)
// TypeMapper handles type conversions between SQL and Go types for Bun
type TypeMapper struct {
// Package alias for sql_types import
sqlTypesAlias string
}
// NewTypeMapper creates a new TypeMapper with default settings
func NewTypeMapper() *TypeMapper {
return &TypeMapper{
sqlTypesAlias: "resolvespec_common",
}
}
// SQLTypeToGoType converts a SQL type to its Go equivalent
// Uses ResolveSpec common package types (all are nullable by default in Bun)
func (tm *TypeMapper) SQLTypeToGoType(sqlType string, notNull bool) string {
// Normalize SQL type (lowercase, remove length/precision)
baseType := tm.extractBaseType(sqlType)
// For Bun, we typically use resolvespec_common types for most fields
// unless they're explicitly NOT NULL and we want to avoid null handling
if notNull && tm.isSimpleType(baseType) {
return tm.baseGoType(baseType)
}
// Use resolvespec_common types for nullable fields
return tm.bunGoType(baseType)
}
// extractBaseType extracts the base type from a SQL type string
func (tm *TypeMapper) extractBaseType(sqlType string) string {
sqlType = strings.ToLower(strings.TrimSpace(sqlType))
// Remove everything after '('
if idx := strings.Index(sqlType, "("); idx > 0 {
sqlType = sqlType[:idx]
}
return sqlType
}
// isSimpleType checks if a type should use base Go type when NOT NULL
func (tm *TypeMapper) isSimpleType(sqlType string) bool {
simpleTypes := map[string]bool{
"bigint": true,
"integer": true,
"int8": true,
"int4": true,
"boolean": true,
"bool": true,
}
return simpleTypes[sqlType]
}
// baseGoType returns the base Go type for a SQL type (not null, simple types only)
func (tm *TypeMapper) baseGoType(sqlType string) string {
typeMap := map[string]string{
"integer": "int32",
"int": "int32",
"int4": "int32",
"smallint": "int16",
"int2": "int16",
"bigint": "int64",
"int8": "int64",
"serial": "int32",
"bigserial": "int64",
"boolean": "bool",
"bool": "bool",
}
if goType, ok := typeMap[sqlType]; ok {
return goType
}
// Default to resolvespec type
return tm.bunGoType(sqlType)
}
// bunGoType returns the Bun/ResolveSpec common type
func (tm *TypeMapper) bunGoType(sqlType string) string {
typeMap := map[string]string{
// Integer types
"integer": tm.sqlTypesAlias + ".SqlInt32",
"int": tm.sqlTypesAlias + ".SqlInt32",
"int4": tm.sqlTypesAlias + ".SqlInt32",
"smallint": tm.sqlTypesAlias + ".SqlInt16",
"int2": tm.sqlTypesAlias + ".SqlInt16",
"bigint": tm.sqlTypesAlias + ".SqlInt64",
"int8": tm.sqlTypesAlias + ".SqlInt64",
"serial": tm.sqlTypesAlias + ".SqlInt32",
"bigserial": tm.sqlTypesAlias + ".SqlInt64",
"smallserial": tm.sqlTypesAlias + ".SqlInt16",
// String types
"text": tm.sqlTypesAlias + ".SqlString",
"varchar": tm.sqlTypesAlias + ".SqlString",
"char": tm.sqlTypesAlias + ".SqlString",
"character": tm.sqlTypesAlias + ".SqlString",
"citext": tm.sqlTypesAlias + ".SqlString",
"bpchar": tm.sqlTypesAlias + ".SqlString",
// Boolean
"boolean": tm.sqlTypesAlias + ".SqlBool",
"bool": tm.sqlTypesAlias + ".SqlBool",
// Float types
"real": tm.sqlTypesAlias + ".SqlFloat32",
"float4": tm.sqlTypesAlias + ".SqlFloat32",
"double precision": tm.sqlTypesAlias + ".SqlFloat64",
"float8": tm.sqlTypesAlias + ".SqlFloat64",
"numeric": tm.sqlTypesAlias + ".SqlFloat64",
"decimal": tm.sqlTypesAlias + ".SqlFloat64",
// Date/Time types
"timestamp": tm.sqlTypesAlias + ".SqlTime",
"timestamp without time zone": tm.sqlTypesAlias + ".SqlTime",
"timestamp with time zone": tm.sqlTypesAlias + ".SqlTime",
"timestamptz": tm.sqlTypesAlias + ".SqlTime",
"date": tm.sqlTypesAlias + ".SqlDate",
"time": tm.sqlTypesAlias + ".SqlTime",
"time without time zone": tm.sqlTypesAlias + ".SqlTime",
"time with time zone": tm.sqlTypesAlias + ".SqlTime",
"timetz": tm.sqlTypesAlias + ".SqlTime",
// Binary
"bytea": "[]byte",
// UUID
"uuid": tm.sqlTypesAlias + ".SqlUUID",
// JSON
"json": tm.sqlTypesAlias + ".SqlJSON",
"jsonb": tm.sqlTypesAlias + ".SqlJSONB",
// Network
"inet": tm.sqlTypesAlias + ".SqlString",
"cidr": tm.sqlTypesAlias + ".SqlString",
"macaddr": tm.sqlTypesAlias + ".SqlString",
// Other
"money": tm.sqlTypesAlias + ".SqlFloat64",
}
if goType, ok := typeMap[sqlType]; ok {
return goType
}
// Default to SqlString for unknown types
return tm.sqlTypesAlias + ".SqlString"
}
// BuildBunTag generates a complete Bun tag string for a column
// Bun format: bun:"column_name,type:type_name,pk,default:value"
func (tm *TypeMapper) BuildBunTag(column *models.Column, table *models.Table) string {
var parts []string
// Column name comes first (no prefix)
parts = append(parts, column.Name)
// Add type if specified
if column.Type != "" {
typeStr := column.Type
if column.Length > 0 {
typeStr = fmt.Sprintf("%s(%d)", typeStr, column.Length)
} else if column.Precision > 0 {
if column.Scale > 0 {
typeStr = fmt.Sprintf("%s(%d,%d)", typeStr, column.Precision, column.Scale)
} else {
typeStr = fmt.Sprintf("%s(%d)", typeStr, column.Precision)
}
}
parts = append(parts, fmt.Sprintf("type:%s", typeStr))
}
// Primary key
if column.IsPrimaryKey {
parts = append(parts, "pk")
}
// Default value
if column.Default != nil {
parts = append(parts, fmt.Sprintf("default:%v", column.Default))
}
// Nullable (Bun uses nullzero for nullable fields)
if !column.NotNull && !column.IsPrimaryKey {
parts = append(parts, "nullzero")
}
// Check for unique constraint
if table != nil {
for _, constraint := range table.Constraints {
if constraint.Type == models.UniqueConstraint {
for _, col := range constraint.Columns {
if col == column.Name {
parts = append(parts, "unique")
break
}
}
}
}
}
// Join with commas and add trailing comma (Bun convention)
return strings.Join(parts, ",") + ","
}
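For illustration, the tag assembly above can be sketched standalone. The helper below is hypothetical (the real method reads name, type, and flags from `models.Column`), but it reproduces the same ordering: column name first, then `type:`, then `pk`, then `nullzero` for nullable non-key columns, with Bun's trailing comma.

```go
package main

import (
	"fmt"
	"strings"
)

// buildTag mimics BuildBunTag's assembly for a hypothetical column:
// name, rendered SQL type, primary-key flag, and NOT NULL flag.
func buildTag(name, sqlType string, pk, notNull bool) string {
	parts := []string{name} // column name comes first, no prefix
	if sqlType != "" {
		parts = append(parts, "type:"+sqlType)
	}
	if pk {
		parts = append(parts, "pk")
	}
	if !notNull && !pk {
		parts = append(parts, "nullzero") // Bun marks nullable fields with nullzero
	}
	return strings.Join(parts, ",") + "," // trailing comma, Bun convention
}

func main() {
	fmt.Println(buildTag("email", "varchar(255)", false, true)) // email,type:varchar(255),
	fmt.Println(buildTag("id", "bigint", true, true))           // id,type:bigint,pk,
}
```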
// BuildRelationshipTag generates Bun tag for relationship fields
// Bun format: bun:"rel:has-one,join:local_column=foreign_column"
func (tm *TypeMapper) BuildRelationshipTag(constraint *models.Constraint, relType string) string {
var parts []string
// Add relationship type
parts = append(parts, fmt.Sprintf("rel:%s", relType))
// Add join clause
if len(constraint.Columns) > 0 && len(constraint.ReferencedColumns) > 0 {
localCol := constraint.Columns[0]
foreignCol := constraint.ReferencedColumns[0]
parts = append(parts, fmt.Sprintf("join:%s=%s", localCol, foreignCol))
}
return strings.Join(parts, ",")
}
// NeedsTimeImport checks if the Go type requires time package import
func (tm *TypeMapper) NeedsTimeImport(goType string) bool {
return strings.Contains(goType, "time.Time")
}
// NeedsFmtImport checks if we need fmt import (for GetIDStr method)
func (tm *TypeMapper) NeedsFmtImport(generateGetIDStr bool) bool {
return generateGetIDStr
}
// GetSQLTypesImport returns the import path for sql_types (ResolveSpec common)
func (tm *TypeMapper) GetSQLTypesImport() string {
return "github.com/bitechdev/ResolveSpec/pkg/common"
}
// GetBunImport returns the import path for Bun
func (tm *TypeMapper) GetBunImport() string {
return "github.com/uptrace/bun"
}

pkg/writers/dbml/writer.go
package dbml
import (
"fmt"
"os"
"strings"
"git.warky.dev/wdevs/relspecgo/pkg/models"
"git.warky.dev/wdevs/relspecgo/pkg/writers"
)
// Writer implements the writers.Writer interface for DBML format
type Writer struct {
options *writers.WriterOptions
}
// NewWriter creates a new DBML writer with the given options
func NewWriter(options *writers.WriterOptions) *Writer {
return &Writer{
options: options,
}
}
// WriteDatabase writes a Database model to DBML format
func (w *Writer) WriteDatabase(db *models.Database) error {
content := w.databaseToDBML(db)
if w.options.OutputPath != "" {
return os.WriteFile(w.options.OutputPath, []byte(content), 0644)
}
// If no output path, print to stdout
fmt.Print(content)
return nil
}
// WriteSchema writes a Schema model to DBML format
func (w *Writer) WriteSchema(schema *models.Schema) error {
content := w.schemaToDBML(schema)
if w.options.OutputPath != "" {
return os.WriteFile(w.options.OutputPath, []byte(content), 0644)
}
fmt.Print(content)
return nil
}
// WriteTable writes a Table model to DBML format
func (w *Writer) WriteTable(table *models.Table) error {
content := w.tableToDBML(table, table.Schema)
if w.options.OutputPath != "" {
return os.WriteFile(w.options.OutputPath, []byte(content), 0644)
}
fmt.Print(content)
return nil
}
// databaseToDBML converts a Database to DBML format string
func (w *Writer) databaseToDBML(d *models.Database) string {
var result string
// Add database comment if exists
if d.Description != "" {
result += fmt.Sprintf("// %s\n", d.Description)
}
if d.Comment != "" {
result += fmt.Sprintf("// %s\n", d.Comment)
}
if d.Description != "" || d.Comment != "" {
result += "\n"
}
// Process each schema
for _, schema := range d.Schemas {
result += w.schemaToDBML(schema)
}
// Add relationships
result += "\n// Relationships\n"
for _, schema := range d.Schemas {
for _, table := range schema.Tables {
for _, constraint := range table.Constraints {
if constraint.Type == models.ForeignKeyConstraint {
result += w.constraintToDBML(constraint, schema.Name, table.Name)
}
}
}
}
return result
}
// schemaToDBML converts a Schema to DBML format string
func (w *Writer) schemaToDBML(schema *models.Schema) string {
var result string
if schema.Description != "" {
result += fmt.Sprintf("// Schema: %s - %s\n", schema.Name, schema.Description)
}
// Process tables
for _, table := range schema.Tables {
result += w.tableToDBML(table, schema.Name)
result += "\n"
}
return result
}
// tableToDBML converts a Table to DBML format string
func (w *Writer) tableToDBML(t *models.Table, schemaName string) string {
var result string
// Table definition
tableName := fmt.Sprintf("%s.%s", schemaName, t.Name)
result += fmt.Sprintf("Table %s {\n", tableName)
// Add columns
for _, column := range t.Columns {
result += fmt.Sprintf(" %s %s", column.Name, column.Type)
// Add column attributes
attrs := make([]string, 0)
if column.IsPrimaryKey {
attrs = append(attrs, "primary key")
}
if column.NotNull && !column.IsPrimaryKey {
attrs = append(attrs, "not null")
}
if column.AutoIncrement {
attrs = append(attrs, "increment")
}
if column.Default != nil {
attrs = append(attrs, fmt.Sprintf("default: %v", column.Default))
}
if len(attrs) > 0 {
result += fmt.Sprintf(" [%s]", strings.Join(attrs, ", "))
}
if column.Comment != "" {
result += fmt.Sprintf(" // %s", column.Comment)
}
result += "\n"
}
// Add indexes
indexCount := 0
for _, index := range t.Indexes {
if indexCount == 0 {
result += "\n indexes {\n"
}
indexAttrs := make([]string, 0)
if index.Unique {
indexAttrs = append(indexAttrs, "unique")
}
if index.Name != "" {
indexAttrs = append(indexAttrs, fmt.Sprintf("name: '%s'", index.Name))
}
if index.Type != "" {
indexAttrs = append(indexAttrs, fmt.Sprintf("type: %s", index.Type))
}
result += fmt.Sprintf(" (%s)", strings.Join(index.Columns, ", "))
if len(indexAttrs) > 0 {
result += fmt.Sprintf(" [%s]", strings.Join(indexAttrs, ", "))
}
result += "\n"
indexCount++
}
if indexCount > 0 {
result += " }\n"
}
// Add table note
if t.Description != "" || t.Comment != "" {
note := t.Description
if note != "" && t.Comment != "" {
note += " - "
}
note += t.Comment
result += fmt.Sprintf("\n Note: '%s'\n", note)
}
result += "}\n"
return result
}
// constraintToDBML converts a Constraint to DBML format string
func (w *Writer) constraintToDBML(c *models.Constraint, schemaName, tableName string) string {
if c.Type != models.ForeignKeyConstraint || c.ReferencedTable == "" {
return ""
}
fromTable := fmt.Sprintf("%s.%s", schemaName, tableName)
toTable := fmt.Sprintf("%s.%s", c.ReferencedSchema, c.ReferencedTable)
// Determine relationship cardinality
// For foreign keys, it's typically many-to-one
relationship := ">"
fromCols := strings.Join(c.Columns, ", ")
toCols := strings.Join(c.ReferencedColumns, ", ")
result := fmt.Sprintf("Ref: %s.(%s) %s %s.(%s)", fromTable, fromCols, relationship, toTable, toCols)
// Add actions
actions := make([]string, 0)
if c.OnDelete != "" {
actions = append(actions, fmt.Sprintf("ondelete: %s", c.OnDelete))
}
if c.OnUpdate != "" {
actions = append(actions, fmt.Sprintf("onupdate: %s", c.OnUpdate))
}
if len(actions) > 0 {
result += fmt.Sprintf(" [%s]", strings.Join(actions, ", "))
}
result += "\n"
return result
}
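As a standalone sketch of the `Ref:` line this produces (hypothetical table and column names, not taken from a real schema):

```go
package main

import (
	"fmt"
	"strings"
)

// refLine mirrors constraintToDBML's output shape for a foreign key:
// "Ref: from.(cols) > to.(cols) [ondelete: ...]" — ">" marks many-to-one in DBML.
func refLine(fromTable string, fromCols []string, toTable string, toCols []string, onDelete string) string {
	s := fmt.Sprintf("Ref: %s.(%s) > %s.(%s)",
		fromTable, strings.Join(fromCols, ", "),
		toTable, strings.Join(toCols, ", "))
	if onDelete != "" {
		s += fmt.Sprintf(" [ondelete: %s]", onDelete)
	}
	return s
}

func main() {
	fmt.Println(refLine("public.orders", []string{"customer_id"},
		"public.customers", []string{"id"}, "CASCADE"))
}
```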

package dctx
import (
"fmt"
"git.warky.dev/wdevs/relspecgo/pkg/models"
"git.warky.dev/wdevs/relspecgo/pkg/writers"
)
// Writer implements the writers.Writer interface for DCTX format
// Note: DCTX is a read-only format used for loading Clarion dictionary files
type Writer struct {
options *writers.WriterOptions
}
// NewWriter creates a new DCTX writer with the given options
func NewWriter(options *writers.WriterOptions) *Writer {
return &Writer{
options: options,
}
}
// WriteDatabase returns an error as DCTX format is read-only
func (w *Writer) WriteDatabase(db *models.Database) error {
return fmt.Errorf("DCTX format is read-only and does not support writing - it is used for loading Clarion dictionary files only")
}
// WriteSchema returns an error as DCTX format is read-only
func (w *Writer) WriteSchema(schema *models.Schema) error {
return fmt.Errorf("DCTX format is read-only and does not support writing - it is used for loading Clarion dictionary files only")
}
// WriteTable returns an error as DCTX format is read-only
func (w *Writer) WriteTable(table *models.Table) error {
return fmt.Errorf("DCTX format is read-only and does not support writing - it is used for loading Clarion dictionary files only")
}

package drawdb
// DrawDBSchema represents the complete DrawDB JSON structure
type DrawDBSchema struct {
Tables []*DrawDBTable `json:"tables" yaml:"tables" xml:"tables"`
Relationships []*DrawDBRelationship `json:"relationships" yaml:"relationships" xml:"relationships"`
Notes []*DrawDBNote `json:"notes,omitempty" yaml:"notes,omitempty" xml:"notes,omitempty"`
SubjectAreas []*DrawDBArea `json:"subjectAreas,omitempty" yaml:"subjectAreas,omitempty" xml:"subjectAreas,omitempty"`
}
// DrawDBTable represents a table in DrawDB format
type DrawDBTable struct {
ID int `json:"id" yaml:"id" xml:"id"`
Name string `json:"name" yaml:"name" xml:"name"`
Schema string `json:"schema,omitempty" yaml:"schema,omitempty" xml:"schema,omitempty"`
Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
Color string `json:"color" yaml:"color" xml:"color"`
X int `json:"x" yaml:"x" xml:"x"`
Y int `json:"y" yaml:"y" xml:"y"`
Fields []*DrawDBField `json:"fields" yaml:"fields" xml:"fields"`
Indexes []*DrawDBIndex `json:"indexes,omitempty" yaml:"indexes,omitempty" xml:"indexes,omitempty"`
}
// DrawDBField represents a column/field in DrawDB format
type DrawDBField struct {
ID int `json:"id" yaml:"id" xml:"id"`
Name string `json:"name" yaml:"name" xml:"name"`
Type string `json:"type" yaml:"type" xml:"type"`
Default string `json:"default,omitempty" yaml:"default,omitempty" xml:"default,omitempty"`
Check string `json:"check,omitempty" yaml:"check,omitempty" xml:"check,omitempty"`
Primary bool `json:"primary" yaml:"primary" xml:"primary"`
Unique bool `json:"unique" yaml:"unique" xml:"unique"`
NotNull bool `json:"notNull" yaml:"notNull" xml:"notNull"`
Increment bool `json:"increment" yaml:"increment" xml:"increment"`
Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
}
// DrawDBIndex represents an index in DrawDB format
type DrawDBIndex struct {
ID int `json:"id" yaml:"id" xml:"id"`
Name string `json:"name" yaml:"name" xml:"name"`
Unique bool `json:"unique" yaml:"unique" xml:"unique"`
Fields []int `json:"fields" yaml:"fields" xml:"fields"` // Field IDs
}
// DrawDBRelationship represents a relationship in DrawDB format
type DrawDBRelationship struct {
ID int `json:"id" yaml:"id" xml:"id"`
Name string `json:"name" yaml:"name" xml:"name"`
StartTableID int `json:"startTableId" yaml:"startTableId" xml:"startTableId"`
EndTableID int `json:"endTableId" yaml:"endTableId" xml:"endTableId"`
StartFieldID int `json:"startFieldId" yaml:"startFieldId" xml:"startFieldId"`
EndFieldID int `json:"endFieldId" yaml:"endFieldId" xml:"endFieldId"`
Cardinality string `json:"cardinality" yaml:"cardinality" xml:"cardinality"` // "One to one", "One to many", "Many to one"
UpdateConstraint string `json:"updateConstraint,omitempty" yaml:"updateConstraint,omitempty" xml:"updateConstraint,omitempty"`
DeleteConstraint string `json:"deleteConstraint,omitempty" yaml:"deleteConstraint,omitempty" xml:"deleteConstraint,omitempty"`
}
// DrawDBNote represents a note in DrawDB format
type DrawDBNote struct {
ID int `json:"id" yaml:"id" xml:"id"`
Content string `json:"content" yaml:"content" xml:"content"`
Color string `json:"color" yaml:"color" xml:"color"`
X int `json:"x" yaml:"x" xml:"x"`
Y int `json:"y" yaml:"y" xml:"y"`
}
// DrawDBArea represents a subject area/grouping in DrawDB format
type DrawDBArea struct {
ID int `json:"id" yaml:"id" xml:"id"`
Name string `json:"name" yaml:"name" xml:"name"`
Color string `json:"color" yaml:"color" xml:"color"`
X int `json:"x" yaml:"x" xml:"x"`
Y int `json:"y" yaml:"y" xml:"y"`
Width int `json:"width" yaml:"width" xml:"width"`
Height int `json:"height" yaml:"height" xml:"height"`
}

package drawdb
import (
"encoding/json"
"fmt"
"os"
"git.warky.dev/wdevs/relspecgo/pkg/models"
"git.warky.dev/wdevs/relspecgo/pkg/writers"
)
// Writer implements the writers.Writer interface for DrawDB JSON format
type Writer struct {
options *writers.WriterOptions
}
// NewWriter creates a new DrawDB writer with the given options
func NewWriter(options *writers.WriterOptions) *Writer {
return &Writer{
options: options,
}
}
// WriteDatabase writes a Database model to DrawDB JSON format
func (w *Writer) WriteDatabase(db *models.Database) error {
schema := w.databaseToDrawDB(db)
return w.writeJSON(schema)
}
// WriteSchema writes a Schema model to DrawDB JSON format
func (w *Writer) WriteSchema(schema *models.Schema) error {
drawSchema := w.schemaToDrawDB(schema)
return w.writeJSON(drawSchema)
}
// WriteTable writes a Table model to DrawDB JSON format
func (w *Writer) WriteTable(table *models.Table) error {
drawSchema := w.tableToDrawDB(table)
return w.writeJSON(drawSchema)
}
// writeJSON marshals the data to JSON and writes to output
func (w *Writer) writeJSON(data interface{}) error {
jsonData, err := json.MarshalIndent(data, "", " ")
if err != nil {
return fmt.Errorf("failed to marshal to JSON: %w", err)
}
if w.options.OutputPath != "" {
return os.WriteFile(w.options.OutputPath, jsonData, 0644)
}
// If no output path, print to stdout
fmt.Println(string(jsonData))
return nil
}
// databaseToDrawDB converts a Database to DrawDB JSON format
func (w *Writer) databaseToDrawDB(d *models.Database) *DrawDBSchema {
schema := &DrawDBSchema{
Tables: make([]*DrawDBTable, 0),
Relationships: make([]*DrawDBRelationship, 0),
Notes: make([]*DrawDBNote, 0),
SubjectAreas: make([]*DrawDBArea, 0),
}
// Track IDs and mappings
tableID := 0
fieldID := 0
relationshipID := 0
noteID := 0
areaID := 0
// Map to track table name to ID
tableMap := make(map[string]int)
// Map to track field full path to ID
fieldMap := make(map[string]int)
// Position tables in a grid layout
gridX, gridY := 50, 50
colWidth, rowHeight := 300, 200
tablesPerRow := 4
tableIndex := 0
// Create subject areas for schemas
for schemaIdx, schemaModel := range d.Schemas {
if schemaModel.Description != "" || schemaModel.Comment != "" {
note := schemaModel.Description
if note != "" && schemaModel.Comment != "" {
note += "\n"
}
note += schemaModel.Comment
area := &DrawDBArea{
ID: areaID,
Name: schemaModel.Name,
Color: getColorForIndex(schemaIdx),
X: gridX - 20,
Y: gridY - 20,
Width: colWidth*tablesPerRow + 100,
Height: rowHeight*((len(schemaModel.Tables)/tablesPerRow)+1) + 100,
}
schema.SubjectAreas = append(schema.SubjectAreas, area)
areaID++
}
// Process tables in schema
for _, table := range schemaModel.Tables {
drawTable, newFieldID := w.convertTableToDrawDB(table, schemaModel.Name, tableID, fieldID, tableIndex, tablesPerRow, gridX, gridY, colWidth, rowHeight, schemaIdx)
// Store table mapping
tableKey := fmt.Sprintf("%s.%s", schemaModel.Name, table.Name)
tableMap[tableKey] = tableID
// Store field mappings
for _, field := range drawTable.Fields {
fieldKey := fmt.Sprintf("%s.%s.%s", schemaModel.Name, table.Name, field.Name)
fieldMap[fieldKey] = field.ID
}
schema.Tables = append(schema.Tables, drawTable)
fieldID = newFieldID
tableID++
tableIndex++
}
}
// Add relationships
for _, schemaModel := range d.Schemas {
for _, table := range schemaModel.Tables {
for _, constraint := range table.Constraints {
if constraint.Type == models.ForeignKeyConstraint && constraint.ReferencedTable != "" {
startTableKey := fmt.Sprintf("%s.%s", schemaModel.Name, table.Name)
endTableKey := fmt.Sprintf("%s.%s", constraint.ReferencedSchema, constraint.ReferencedTable)
startTableID, startExists := tableMap[startTableKey]
endTableID, endExists := tableMap[endTableKey]
if startExists && endExists && len(constraint.Columns) > 0 && len(constraint.ReferencedColumns) > 0 {
// Find relative field IDs within their tables
startFieldID := 0
endFieldID := 0
for _, t := range schema.Tables {
if t.ID == startTableID {
for idx, f := range t.Fields {
if f.Name == constraint.Columns[0] {
startFieldID = idx
break
}
}
}
if t.ID == endTableID {
for idx, f := range t.Fields {
if f.Name == constraint.ReferencedColumns[0] {
endFieldID = idx
break
}
}
}
}
relationship := &DrawDBRelationship{
ID: relationshipID,
Name: constraint.Name,
StartTableID: startTableID,
EndTableID: endTableID,
StartFieldID: startFieldID,
EndFieldID: endFieldID,
Cardinality: "Many to one",
UpdateConstraint: constraint.OnUpdate,
DeleteConstraint: constraint.OnDelete,
}
schema.Relationships = append(schema.Relationships, relationship)
relationshipID++
}
}
}
}
}
// Add database description as a note
if d.Description != "" || d.Comment != "" {
note := d.Description
if note != "" && d.Comment != "" {
note += "\n"
}
note += d.Comment
schema.Notes = append(schema.Notes, &DrawDBNote{
ID: noteID,
Content: fmt.Sprintf("Database: %s\n\n%s", d.Name, note),
Color: "#ffd93d",
X: 10,
Y: 10,
})
}
return schema
}
// schemaToDrawDB converts a Schema to DrawDB format
func (w *Writer) schemaToDrawDB(schema *models.Schema) *DrawDBSchema {
drawSchema := &DrawDBSchema{
Tables: make([]*DrawDBTable, 0),
Relationships: make([]*DrawDBRelationship, 0),
Notes: make([]*DrawDBNote, 0),
SubjectAreas: make([]*DrawDBArea, 0),
}
tableID := 0
fieldID := 0
gridX, gridY := 50, 50
colWidth, rowHeight := 300, 200
tablesPerRow := 4
for idx, table := range schema.Tables {
drawTable, newFieldID := w.convertTableToDrawDB(table, schema.Name, tableID, fieldID, idx, tablesPerRow, gridX, gridY, colWidth, rowHeight, 0)
drawSchema.Tables = append(drawSchema.Tables, drawTable)
fieldID = newFieldID
tableID++
}
return drawSchema
}
// tableToDrawDB converts a single Table to DrawDB format
func (w *Writer) tableToDrawDB(table *models.Table) *DrawDBSchema {
drawSchema := &DrawDBSchema{
Tables: make([]*DrawDBTable, 0),
Relationships: make([]*DrawDBRelationship, 0),
Notes: make([]*DrawDBNote, 0),
SubjectAreas: make([]*DrawDBArea, 0),
}
drawTable, _ := w.convertTableToDrawDB(table, table.Schema, 0, 0, 0, 4, 50, 50, 300, 200, 0)
drawSchema.Tables = append(drawSchema.Tables, drawTable)
return drawSchema
}
// convertTableToDrawDB converts a table to DrawDB format and returns the table and next field ID
func (w *Writer) convertTableToDrawDB(table *models.Table, schemaName string, tableID, fieldID, tableIndex, tablesPerRow, gridX, gridY, colWidth, rowHeight, colorIndex int) (*DrawDBTable, int) {
// Calculate position
x := gridX + (tableIndex%tablesPerRow)*colWidth
y := gridY + (tableIndex/tablesPerRow)*rowHeight
drawTable := &DrawDBTable{
ID: tableID,
Name: table.Name,
Schema: schemaName,
Comment: table.Description,
Color: getColorForIndex(colorIndex),
X: x,
Y: y,
Fields: make([]*DrawDBField, 0),
Indexes: make([]*DrawDBIndex, 0),
}
// Add fields
for _, column := range table.Columns {
field := &DrawDBField{
ID: fieldID,
Name: column.Name,
Type: formatTypeForDrawDB(column),
Primary: column.IsPrimaryKey,
NotNull: column.NotNull,
Increment: column.AutoIncrement,
Comment: column.Comment,
}
if column.Default != nil {
field.Default = fmt.Sprintf("%v", column.Default)
}
// Check for unique constraint
for _, constraint := range table.Constraints {
if constraint.Type == models.UniqueConstraint {
for _, col := range constraint.Columns {
if col == column.Name {
field.Unique = true
break
}
}
}
}
drawTable.Fields = append(drawTable.Fields, field)
fieldID++
}
// Add indexes
indexID := 0
for _, index := range table.Indexes {
drawIndex := &DrawDBIndex{
ID: indexID,
Name: index.Name,
Unique: index.Unique,
Fields: make([]int, 0),
}
// Map column names to field IDs
for _, colName := range index.Columns {
for idx, field := range drawTable.Fields {
if field.Name == colName {
drawIndex.Fields = append(drawIndex.Fields, idx)
break
}
}
}
drawTable.Indexes = append(drawTable.Indexes, drawIndex)
indexID++
}
return drawTable, fieldID
}
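The grid placement used above can be isolated into a few lines. This is a sketch of the same arithmetic with the default constants hard-coded (the real function takes them as parameters): tables flow left to right, four per row, in 300x200 cells starting at (50, 50).

```go
package main

import "fmt"

// gridPos mirrors convertTableToDrawDB's layout arithmetic for a table's
// position in the diagram, given its zero-based index.
func gridPos(tableIndex int) (x, y int) {
	const gridX, gridY = 50, 50
	const colWidth, rowHeight = 300, 200
	const tablesPerRow = 4
	x = gridX + (tableIndex%tablesPerRow)*colWidth // column within the row
	y = gridY + (tableIndex/tablesPerRow)*rowHeight // row number
	return x, y
}

func main() {
	for i := 0; i < 6; i++ {
		x, y := gridPos(i)
		fmt.Printf("table %d -> (%d, %d)\n", i, x, y)
	}
}
```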
// Helper functions
func formatTypeForDrawDB(column *models.Column) string {
typeStr := column.Type
if column.Length > 0 {
typeStr = fmt.Sprintf("%s(%d)", typeStr, column.Length)
} else if column.Precision > 0 {
if column.Scale > 0 {
typeStr = fmt.Sprintf("%s(%d,%d)", typeStr, column.Precision, column.Scale)
} else {
typeStr = fmt.Sprintf("%s(%d)", typeStr, column.Precision)
}
}
return typeStr
}
func getColorForIndex(index int) string {
colors := []string{
"#6366f1", // indigo
"#8b5cf6", // violet
"#ec4899", // pink
"#f43f5e", // rose
"#14b8a6", // teal
"#06b6d4", // cyan
"#0ea5e9", // sky
"#3b82f6", // blue
}
return colors[index%len(colors)]
}

package gorm
import (
"strings"
"unicode"
)
// SnakeCaseToPascalCase converts snake_case to PascalCase
// Examples: user_id → UserID, http_request → HTTPRequest
func SnakeCaseToPascalCase(s string) string {
if s == "" {
return ""
}
parts := strings.Split(s, "_")
for i, part := range parts {
parts[i] = capitalize(part)
}
return strings.Join(parts, "")
}
// SnakeCaseToCamelCase converts snake_case to camelCase
// Examples: user_id → userID, http_request → httpRequest
func SnakeCaseToCamelCase(s string) string {
if s == "" {
return ""
}
parts := strings.Split(s, "_")
for i, part := range parts {
if i == 0 {
parts[i] = strings.ToLower(part)
} else {
parts[i] = capitalize(part)
}
}
return strings.Join(parts, "")
}
// PascalCaseToSnakeCase converts PascalCase to snake_case
// Examples: UserID → user_id, HTTPRequest → http_request
func PascalCaseToSnakeCase(s string) string {
if s == "" {
return ""
}
var result strings.Builder
var prevUpper bool
var nextUpper bool
runes := []rune(s)
for i, r := range runes {
isUpper := unicode.IsUpper(r)
if i+1 < len(runes) {
nextUpper = unicode.IsUpper(runes[i+1])
} else {
nextUpper = false
}
if i > 0 && isUpper {
// Add underscore before uppercase letter if:
// 1. Previous char was lowercase, OR
// 2. Next char is lowercase (end of acronym)
if !prevUpper || (!nextUpper && i+1 < len(runes)) {
result.WriteRune('_')
}
}
result.WriteRune(unicode.ToLower(r))
prevUpper = isUpper
}
return result.String()
}
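The two underscore rules can be condensed into a short standalone sketch (digits count as lowercase here, as in the function above, since `unicode.IsUpper` is false for them):

```go
package main

import (
	"fmt"
	"unicode"
)

// toSnake is a condensed sketch of PascalCaseToSnakeCase's rules: insert '_'
// before an uppercase rune when the previous rune was lowercase, or when the
// next rune is lowercase (i.e. the end of an acronym run).
func toSnake(s string) string {
	runes := []rune(s)
	out := make([]rune, 0, len(runes)+4)
	for i, r := range runes {
		if i > 0 && unicode.IsUpper(r) {
			prevLower := !unicode.IsUpper(runes[i-1])
			nextLower := i+1 < len(runes) && !unicode.IsUpper(runes[i+1])
			if prevLower || nextLower {
				out = append(out, '_')
			}
		}
		out = append(out, unicode.ToLower(r))
	}
	return string(out)
}

func main() {
	fmt.Println(toSnake("UserID"))      // user_id
	fmt.Println(toSnake("HTTPRequest")) // http_request
}
```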
// capitalize capitalizes the first letter and handles common acronyms
func capitalize(s string) string {
if s == "" {
return ""
}
upper := strings.ToUpper(s)
// Handle common acronyms
acronyms := map[string]bool{
"ID": true,
"UUID": true,
"GUID": true,
"URL": true,
"URI": true,
"HTTP": true,
"HTTPS": true,
"API": true,
"JSON": true,
"XML": true,
"SQL": true,
"HTML": true,
"CSS": true,
"RID": true,
}
if acronyms[upper] {
return upper
}
// Capitalize first letter
runes := []rune(s)
runes[0] = unicode.ToUpper(runes[0])
return string(runes)
}
// Pluralize converts a singular word to plural
// Basic implementation with common rules
func Pluralize(s string) string {
if s == "" {
return ""
}
// Special cases
irregular := map[string]string{
"person": "people",
"child": "children",
"tooth": "teeth",
"foot": "feet",
"man": "men",
"woman": "women",
"mouse": "mice",
"goose": "geese",
"ox": "oxen",
"datum": "data",
"medium": "media",
"analysis": "analyses",
"crisis": "crises",
"status": "statuses",
}
if plural, ok := irregular[strings.ToLower(s)]; ok {
return plural
}
// Already plural (ends in 's' but not 'ss' or 'us')
if strings.HasSuffix(s, "s") && !strings.HasSuffix(s, "ss") && !strings.HasSuffix(s, "us") {
return s
}
// Words ending in s, x, z, ch, sh
if strings.HasSuffix(s, "s") || strings.HasSuffix(s, "x") ||
strings.HasSuffix(s, "z") || strings.HasSuffix(s, "ch") ||
strings.HasSuffix(s, "sh") {
return s + "es"
}
// Words ending in consonant + y
if len(s) >= 2 && strings.HasSuffix(s, "y") {
prevChar := s[len(s)-2]
if !isVowel(prevChar) {
return s[:len(s)-1] + "ies"
}
}
// Words ending in f or fe
if strings.HasSuffix(s, "f") {
return s[:len(s)-1] + "ves"
}
if strings.HasSuffix(s, "fe") {
return s[:len(s)-2] + "ves"
}
// Words ending in consonant + o
if len(s) >= 2 && strings.HasSuffix(s, "o") {
prevChar := s[len(s)-2]
if !isVowel(prevChar) {
return s + "es"
}
}
// Default: add 's'
return s + "s"
}
// Singularize converts a plural word to singular
// Basic implementation with common rules
func Singularize(s string) string {
if s == "" {
return ""
}
// Special cases
irregular := map[string]string{
"people": "person",
"children": "child",
"teeth": "tooth",
"feet": "foot",
"men": "man",
"women": "woman",
"mice": "mouse",
"geese": "goose",
"oxen": "ox",
"data": "datum",
"media": "medium",
"analyses": "analysis",
"crises": "crisis",
"statuses": "status",
}
if singular, ok := irregular[strings.ToLower(s)]; ok {
return singular
}
// Words ending in ies
if strings.HasSuffix(s, "ies") && len(s) > 3 {
return s[:len(s)-3] + "y"
}
// Words ending in ves
if strings.HasSuffix(s, "ves") {
return s[:len(s)-3] + "f"
}
// Words ending in ses, xes, zes, ches, shes
if strings.HasSuffix(s, "ses") || strings.HasSuffix(s, "xes") ||
strings.HasSuffix(s, "zes") || strings.HasSuffix(s, "ches") ||
strings.HasSuffix(s, "shes") {
return s[:len(s)-2]
}
// Words ending in s (not ss)
if strings.HasSuffix(s, "s") && !strings.HasSuffix(s, "ss") {
return s[:len(s)-1]
}
// Already singular
return s
}
// GeneratePrefix generates a 3-letter prefix from a table name
// Examples: process → PRO, sales_order_line → SOL, user → USE
func GeneratePrefix(tableName string) string {
if tableName == "" {
return "TBL"
}
// Remove common prefixes
tableName = strings.TrimPrefix(tableName, "tbl_")
tableName = strings.TrimPrefix(tableName, "tb_")
// Split by underscore and take first letters
parts := strings.Split(tableName, "_")
var prefix strings.Builder
for _, part := range parts {
if part == "" {
continue
}
prefix.WriteRune(unicode.ToUpper(rune(part[0])))
if prefix.Len() >= 3 {
break
}
}
result := prefix.String()
// If we don't have 3 letters yet, add more from the first part
if len(result) < 3 && len(parts) > 0 {
firstPart := parts[0]
for i := 1; i < len(firstPart) && len(result) < 3; i++ {
result += strings.ToUpper(string(firstPart[i]))
}
}
// Pad with 'X' if still too short
for len(result) < 3 {
result += "X"
}
return result[:3]
}
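A standalone sketch of the prefix rule (same steps as above, minus the `tbl_`/`tb_` trimming): take the first letter of each underscore-separated part, top up from the first part if short, and pad with 'X' to three letters.

```go
package main

import (
	"fmt"
	"strings"
	"unicode"
)

// prefix3 sketches GeneratePrefix's core rule for a table name.
func prefix3(name string) string {
	parts := strings.Split(name, "_")
	var b strings.Builder
	for _, p := range parts {
		if p == "" {
			continue
		}
		b.WriteRune(unicode.ToUpper(rune(p[0]))) // first letter of each part
		if b.Len() >= 3 {
			break
		}
	}
	out := b.String()
	// Top up from the first part when fewer than 3 parts contributed.
	if len(out) < 3 && len(parts) > 0 {
		first := parts[0]
		for i := 1; i < len(first) && len(out) < 3; i++ {
			out += strings.ToUpper(string(first[i]))
		}
	}
	for len(out) < 3 { // pad very short names
		out += "X"
	}
	return out[:3]
}

func main() {
	fmt.Println(prefix3("sales_order_line")) // SOL
	fmt.Println(prefix3("user"))             // USE
	fmt.Println(prefix3("ab"))               // ABX
}
```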
// isVowel checks if a byte is a vowel
func isVowel(c byte) bool {
c = byte(unicode.ToLower(rune(c)))
return c == 'a' || c == 'e' || c == 'i' || c == 'o' || c == 'u'
}

package gorm
import (
"sort"
"git.warky.dev/wdevs/relspecgo/pkg/models"
)
// TemplateData represents the data passed to the template for code generation
type TemplateData struct {
PackageName string
Imports []string
Models []*ModelData
Config *MethodConfig
}
// ModelData represents a single model/struct in the template
type ModelData struct {
Name string
TableName string // schema.table format
SchemaName string
TableNameOnly string // just table name without schema
Comment string
Fields []*FieldData
Config *MethodConfig
PrimaryKeyField string // Name of the primary key field
IDColumnName string // Name of the ID column in database
Prefix string // 3-letter prefix
}
// FieldData represents a single field in a struct
type FieldData struct {
Name string // Go field name (PascalCase)
Type string // Go type
GormTag string // Complete gorm tag
JSONTag string // JSON tag
Comment string // Field comment
}
// MethodConfig controls which helper methods to generate
type MethodConfig struct {
GenerateTableName bool
GenerateSchemaName bool
GenerateTableNameOnly bool
GenerateGetID bool
GenerateGetIDStr bool
GenerateSetID bool
GenerateUpdateID bool
GenerateGetIDName bool
GenerateGetPrefix bool
}
// DefaultMethodConfig returns a MethodConfig with all methods enabled
func DefaultMethodConfig() *MethodConfig {
return &MethodConfig{
GenerateTableName: true,
GenerateSchemaName: true,
GenerateTableNameOnly: true,
GenerateGetID: true,
GenerateGetIDStr: true,
GenerateSetID: true,
GenerateUpdateID: true,
GenerateGetIDName: true,
GenerateGetPrefix: true,
}
}
// NewTemplateData creates a new TemplateData with the given package name and config
func NewTemplateData(packageName string, config *MethodConfig) *TemplateData {
if config == nil {
config = DefaultMethodConfig()
}
return &TemplateData{
PackageName: packageName,
Imports: make([]string, 0),
Models: make([]*ModelData, 0),
Config: config,
}
}
// AddModel adds a model to the template data
func (td *TemplateData) AddModel(model *ModelData) {
model.Config = td.Config
td.Models = append(td.Models, model)
}
// AddImport adds an import to the template data (deduplicates automatically)
func (td *TemplateData) AddImport(importPath string) {
// Check if already exists
for _, imp := range td.Imports {
if imp == importPath {
return
}
}
td.Imports = append(td.Imports, importPath)
}
// FinalizeImports sorts and organizes imports
func (td *TemplateData) FinalizeImports() {
// Sort imports alphabetically
sort.Strings(td.Imports)
}
// NewModelData creates a new ModelData from a models.Table
func NewModelData(table *models.Table, schema string, typeMapper *TypeMapper) *ModelData {
tableName := table.Name
if schema != "" {
tableName = schema + "." + table.Name
}
// Generate model name: singularize and convert to PascalCase
singularTable := Singularize(table.Name)
modelName := SnakeCaseToPascalCase(singularTable)
// Add "Model" prefix if not already present
if !hasModelPrefix(modelName) {
modelName = "Model" + modelName
}
model := &ModelData{
Name: modelName,
TableName: tableName,
SchemaName: schema,
TableNameOnly: table.Name,
Comment: formatComment(table.Description, table.Comment),
Fields: make([]*FieldData, 0),
Prefix: GeneratePrefix(table.Name),
}
// Find primary key
for _, col := range table.Columns {
if col.IsPrimaryKey {
model.PrimaryKeyField = SnakeCaseToPascalCase(col.Name)
model.IDColumnName = col.Name
break
}
}
// Convert columns to fields (sorted by sequence or name)
columns := sortColumns(table.Columns)
for _, col := range columns {
field := columnToField(col, table, typeMapper)
model.Fields = append(model.Fields, field)
}
return model
}
// columnToField converts a models.Column to FieldData
func columnToField(col *models.Column, table *models.Table, typeMapper *TypeMapper) *FieldData {
fieldName := SnakeCaseToPascalCase(col.Name)
goType := typeMapper.SQLTypeToGoType(col.Type, col.NotNull)
gormTag := typeMapper.BuildGormTag(col, table)
jsonTag := col.Name // Use column name for JSON tag
return &FieldData{
Name: fieldName,
Type: goType,
GormTag: gormTag,
JSONTag: jsonTag,
Comment: formatComment(col.Description, col.Comment),
}
}
// AddRelationshipField adds a relationship field to the model
func (md *ModelData) AddRelationshipField(field *FieldData) {
md.Fields = append(md.Fields, field)
}
// formatComment combines description and comment into a single comment string
func formatComment(description, comment string) string {
if description != "" && comment != "" {
return description + " - " + comment
}
if description != "" {
return description
}
return comment
}
// hasModelPrefix checks if a name already has "Model" prefix
func hasModelPrefix(name string) bool {
return len(name) >= 5 && name[:5] == "Model"
}
// sortColumns sorts columns by sequence, then by name
func sortColumns(columns map[string]*models.Column) []*models.Column {
result := make([]*models.Column, 0, len(columns))
for _, col := range columns {
result = append(result, col)
}
sort.Slice(result, func(i, j int) bool {
// Sort by sequence if both have it
if result[i].Sequence > 0 && result[j].Sequence > 0 {
return result[i].Sequence < result[j].Sequence
}
// Put primary keys first
if result[i].IsPrimaryKey != result[j].IsPrimaryKey {
return result[i].IsPrimaryKey
}
// Otherwise sort alphabetically
return result[i].Name < result[j].Name
})
return result
}
// LoadMethodConfigFromMetadata loads method configuration from metadata map
func LoadMethodConfigFromMetadata(metadata map[string]interface{}) *MethodConfig {
config := DefaultMethodConfig()
if metadata == nil {
return config
}
// Load each setting from metadata if present
if val, ok := metadata["generate_table_name"].(bool); ok {
config.GenerateTableName = val
}
if val, ok := metadata["generate_schema_name"].(bool); ok {
config.GenerateSchemaName = val
}
if val, ok := metadata["generate_table_name_only"].(bool); ok {
config.GenerateTableNameOnly = val
}
if val, ok := metadata["generate_get_id"].(bool); ok {
config.GenerateGetID = val
}
if val, ok := metadata["generate_get_id_str"].(bool); ok {
config.GenerateGetIDStr = val
}
if val, ok := metadata["generate_set_id"].(bool); ok {
config.GenerateSetID = val
}
if val, ok := metadata["generate_update_id"].(bool); ok {
config.GenerateUpdateID = val
}
if val, ok := metadata["generate_get_id_name"].(bool); ok {
config.GenerateGetIDName = val
}
if val, ok := metadata["generate_get_prefix"].(bool); ok {
config.GenerateGetPrefix = val
}
return config
}


@@ -0,0 +1,109 @@
package gorm
import (
"bytes"
"text/template"
)
// modelTemplate defines the template for generating GORM models
const modelTemplate = `// Code generated by relspecgo. DO NOT EDIT.
package {{.PackageName}}
{{if .Imports -}}
import (
{{range .Imports -}}
{{.}}
{{end -}}
)
{{end}}
{{range .Models}}
{{if .Comment}}// {{.Comment}}{{end}}
type {{.Name}} struct {
{{- range .Fields}}
{{.Name}} {{.Type}} ` + "`gorm:\"{{.GormTag}}\" json:\"{{.JSONTag}}\"`" + `{{if .Comment}} // {{.Comment}}{{end}}
{{- end}}
}
{{if .Config.GenerateTableName}}
// TableName returns the table name for {{.Name}}
func (m {{.Name}}) TableName() string {
return "{{.TableName}}"
}
{{end}}
{{if .Config.GenerateTableNameOnly}}
// TableNameOnly returns the table name without schema for {{.Name}}
func (m {{.Name}}) TableNameOnly() string {
return "{{.TableNameOnly}}"
}
{{end}}
{{if .Config.GenerateSchemaName}}
// SchemaName returns the schema name for {{.Name}}
func (m {{.Name}}) SchemaName() string {
return "{{.SchemaName}}"
}
{{end}}
{{if and .Config.GenerateGetID .PrimaryKeyField}}
// GetID returns the primary key value
func (m {{.Name}}) GetID() int64 {
return int64(m.{{.PrimaryKeyField}})
}
{{end}}
{{if and .Config.GenerateGetIDStr .PrimaryKeyField}}
// GetIDStr returns the primary key as a string
func (m {{.Name}}) GetIDStr() string {
return fmt.Sprintf("%d", m.{{.PrimaryKeyField}})
}
{{end}}
{{if and .Config.GenerateSetID .PrimaryKeyField}}
// SetID sets the primary key value
func (m *{{.Name}}) SetID(newid int64) {
m.UpdateID(newid)
}
{{end}}
{{if and .Config.GenerateUpdateID .PrimaryKeyField}}
// UpdateID updates the primary key value
func (m *{{.Name}}) UpdateID(newid int64) {
m.{{.PrimaryKeyField}} = int32(newid) // NOTE: assumes an int32 primary key column
}
{{end}}
{{if and .Config.GenerateGetIDName .IDColumnName}}
// GetIDName returns the name of the primary key column
func (m {{.Name}}) GetIDName() string {
return "{{.IDColumnName}}"
}
{{end}}
{{if .Config.GenerateGetPrefix}}
// GetPrefix returns the table prefix
func (m {{.Name}}) GetPrefix() string {
return "{{.Prefix}}"
}
{{end}}
{{end -}}
`
// Templates holds the parsed templates
type Templates struct {
modelTmpl *template.Template
}
// NewTemplates creates and parses the templates
func NewTemplates() (*Templates, error) {
modelTmpl, err := template.New("model").Parse(modelTemplate)
if err != nil {
return nil, err
}
return &Templates{
modelTmpl: modelTmpl,
}, nil
}
// GenerateCode executes the template with the given data
func (t *Templates) GenerateCode(data *TemplateData) (string, error) {
var buf bytes.Buffer
err := t.modelTmpl.Execute(&buf, data)
if err != nil {
return "", err
}
return buf.String(), nil
}


@@ -0,0 +1,335 @@
package gorm
import (
"fmt"
"strings"
"git.warky.dev/wdevs/relspecgo/pkg/models"
)
// TypeMapper handles type conversions between SQL and Go types
type TypeMapper struct {
// Package alias for sql_types import
sqlTypesAlias string
}
// NewTypeMapper creates a new TypeMapper with default settings
func NewTypeMapper() *TypeMapper {
return &TypeMapper{
sqlTypesAlias: "sql_types",
}
}
// SQLTypeToGoType converts a SQL type to its Go equivalent
// Handles nullable types using ResolveSpec sql_types package
func (tm *TypeMapper) SQLTypeToGoType(sqlType string, notNull bool) string {
// Normalize SQL type (lowercase, remove length/precision)
baseType := tm.extractBaseType(sqlType)
// If not null, use base Go types
if notNull {
return tm.baseGoType(baseType)
}
// For nullable fields, use sql_types
return tm.nullableGoType(baseType)
}
// extractBaseType extracts the base type from a SQL type string
// Examples: varchar(100) → varchar, numeric(10,2) → numeric
func (tm *TypeMapper) extractBaseType(sqlType string) string {
sqlType = strings.ToLower(strings.TrimSpace(sqlType))
// Remove everything after '('
if idx := strings.Index(sqlType, "("); idx > 0 {
sqlType = sqlType[:idx]
}
return sqlType
}
// baseGoType returns the base Go type for a SQL type (not null)
func (tm *TypeMapper) baseGoType(sqlType string) string {
typeMap := map[string]string{
// Integer types
"integer": "int32",
"int": "int32",
"int4": "int32",
"smallint": "int16",
"int2": "int16",
"bigint": "int64",
"int8": "int64",
"serial": "int32",
"bigserial": "int64",
"smallserial": "int16",
// String types
"text": "string",
"varchar": "string",
"char": "string",
"character": "string",
"citext": "string",
"bpchar": "string",
// Boolean
"boolean": "bool",
"bool": "bool",
// Float types
"real": "float32",
"float4": "float32",
"double precision": "float64",
"float8": "float64",
"numeric": "float64",
"decimal": "float64",
// Date/Time types
"timestamp": "time.Time",
"timestamp without time zone": "time.Time",
"timestamp with time zone": "time.Time",
"timestamptz": "time.Time",
"date": "time.Time",
"time": "time.Time",
"time without time zone": "time.Time",
"time with time zone": "time.Time",
"timetz": "time.Time",
// Binary
"bytea": "[]byte",
// UUID
"uuid": "string",
// JSON
"json": "string",
"jsonb": "string",
// Network
"inet": "string",
"cidr": "string",
"macaddr": "string",
// Other
"money": "float64",
}
if goType, ok := typeMap[sqlType]; ok {
return goType
}
// Default to string for unknown types
return "string"
}
// nullableGoType returns the nullable Go type using sql_types package
func (tm *TypeMapper) nullableGoType(sqlType string) string {
typeMap := map[string]string{
// Integer types
"integer": tm.sqlTypesAlias + ".SqlInt32",
"int": tm.sqlTypesAlias + ".SqlInt32",
"int4": tm.sqlTypesAlias + ".SqlInt32",
"smallint": tm.sqlTypesAlias + ".SqlInt16",
"int2": tm.sqlTypesAlias + ".SqlInt16",
"bigint": tm.sqlTypesAlias + ".SqlInt64",
"int8": tm.sqlTypesAlias + ".SqlInt64",
"serial": tm.sqlTypesAlias + ".SqlInt32",
"bigserial": tm.sqlTypesAlias + ".SqlInt64",
"smallserial": tm.sqlTypesAlias + ".SqlInt16",
// String types
"text": tm.sqlTypesAlias + ".SqlString",
"varchar": tm.sqlTypesAlias + ".SqlString",
"char": tm.sqlTypesAlias + ".SqlString",
"character": tm.sqlTypesAlias + ".SqlString",
"citext": tm.sqlTypesAlias + ".SqlString",
"bpchar": tm.sqlTypesAlias + ".SqlString",
// Boolean
"boolean": tm.sqlTypesAlias + ".SqlBool",
"bool": tm.sqlTypesAlias + ".SqlBool",
// Float types
"real": tm.sqlTypesAlias + ".SqlFloat32",
"float4": tm.sqlTypesAlias + ".SqlFloat32",
"double precision": tm.sqlTypesAlias + ".SqlFloat64",
"float8": tm.sqlTypesAlias + ".SqlFloat64",
"numeric": tm.sqlTypesAlias + ".SqlFloat64",
"decimal": tm.sqlTypesAlias + ".SqlFloat64",
// Date/Time types
"timestamp": tm.sqlTypesAlias + ".SqlTime",
"timestamp without time zone": tm.sqlTypesAlias + ".SqlTime",
"timestamp with time zone": tm.sqlTypesAlias + ".SqlTime",
"timestamptz": tm.sqlTypesAlias + ".SqlTime",
"date": tm.sqlTypesAlias + ".SqlDate",
"time": tm.sqlTypesAlias + ".SqlTime",
"time without time zone": tm.sqlTypesAlias + ".SqlTime",
"time with time zone": tm.sqlTypesAlias + ".SqlTime",
"timetz": tm.sqlTypesAlias + ".SqlTime",
// Binary
"bytea": "[]byte", // No nullable version needed
// UUID
"uuid": tm.sqlTypesAlias + ".SqlUUID",
// JSON
"json": tm.sqlTypesAlias + ".SqlString",
"jsonb": tm.sqlTypesAlias + ".SqlString",
// Network
"inet": tm.sqlTypesAlias + ".SqlString",
"cidr": tm.sqlTypesAlias + ".SqlString",
"macaddr": tm.sqlTypesAlias + ".SqlString",
// Other
"money": tm.sqlTypesAlias + ".SqlFloat64",
}
if goType, ok := typeMap[sqlType]; ok {
return goType
}
// Default to SqlString for unknown types
return tm.sqlTypesAlias + ".SqlString"
}
// BuildGormTag generates a complete GORM tag string for a column
func (tm *TypeMapper) BuildGormTag(column *models.Column, table *models.Table) string {
var parts []string
// Always include column name (lowercase as per user requirement)
parts = append(parts, fmt.Sprintf("column:%s", column.Name))
// Add type if specified
if column.Type != "" {
// Include length, precision, scale if present
typeStr := column.Type
if column.Length > 0 {
typeStr = fmt.Sprintf("%s(%d)", typeStr, column.Length)
} else if column.Precision > 0 {
if column.Scale > 0 {
typeStr = fmt.Sprintf("%s(%d,%d)", typeStr, column.Precision, column.Scale)
} else {
typeStr = fmt.Sprintf("%s(%d)", typeStr, column.Precision)
}
}
parts = append(parts, fmt.Sprintf("type:%s", typeStr))
}
// Primary key
if column.IsPrimaryKey {
parts = append(parts, "primaryKey")
}
// Auto increment
if column.AutoIncrement {
parts = append(parts, "autoIncrement")
}
// Not null (skip if primary key, as it's implied)
if column.NotNull && !column.IsPrimaryKey {
parts = append(parts, "not null")
}
// Default value
if column.Default != nil {
parts = append(parts, fmt.Sprintf("default:%v", column.Default))
}
// Check for unique constraint
if table != nil {
for _, constraint := range table.Constraints {
if constraint.Type == models.UniqueConstraint {
for _, col := range constraint.Columns {
if col == column.Name {
if constraint.Name != "" {
parts = append(parts, fmt.Sprintf("uniqueIndex:%s", constraint.Name))
} else {
parts = append(parts, "unique")
}
break
}
}
}
}
// Check for index
for _, index := range table.Indexes {
for _, col := range index.Columns {
if col == column.Name {
if index.Unique {
if index.Name != "" {
parts = append(parts, fmt.Sprintf("uniqueIndex:%s", index.Name))
} else {
parts = append(parts, "unique")
}
} else {
if index.Name != "" {
parts = append(parts, fmt.Sprintf("index:%s", index.Name))
} else {
parts = append(parts, "index")
}
}
break
}
}
}
}
return strings.Join(parts, ";")
}
// BuildRelationshipTag generates GORM tag for relationship fields
func (tm *TypeMapper) BuildRelationshipTag(constraint *models.Constraint, isParent bool) string {
var parts []string
if !isParent {
// Child side (has foreign key)
if len(constraint.Columns) > 0 && len(constraint.ReferencedColumns) > 0 {
// foreignKey points to the field name in this struct
fkFieldName := SnakeCaseToPascalCase(constraint.Columns[0])
parts = append(parts, fmt.Sprintf("foreignKey:%s", fkFieldName))
// references points to the field name in the other struct
refFieldName := SnakeCaseToPascalCase(constraint.ReferencedColumns[0])
parts = append(parts, fmt.Sprintf("references:%s", refFieldName))
}
} else {
// Parent side (being referenced)
if len(constraint.Columns) > 0 {
fkFieldName := SnakeCaseToPascalCase(constraint.Columns[0])
parts = append(parts, fmt.Sprintf("foreignKey:%s", fkFieldName))
}
}
// Add constraint actions
if constraint.OnDelete != "" {
parts = append(parts, fmt.Sprintf("constraint:OnDelete:%s", strings.ToUpper(constraint.OnDelete)))
}
if constraint.OnUpdate != "" {
if len(parts) > 0 && strings.Contains(parts[len(parts)-1], "constraint:") {
// Append to existing constraint
parts[len(parts)-1] += fmt.Sprintf(",OnUpdate:%s", strings.ToUpper(constraint.OnUpdate))
} else {
parts = append(parts, fmt.Sprintf("constraint:OnUpdate:%s", strings.ToUpper(constraint.OnUpdate)))
}
}
return strings.Join(parts, ";")
}
// NeedsTimeImport checks if the Go type requires time package import
func (tm *TypeMapper) NeedsTimeImport(goType string) bool {
return strings.Contains(goType, "time.Time")
}
// NeedsFmtImport checks if we need fmt import (for GetIDStr method)
func (tm *TypeMapper) NeedsFmtImport(generateGetIDStr bool) bool {
return generateGetIDStr
}
// GetSQLTypesImport returns the import path for sql_types
func (tm *TypeMapper) GetSQLTypesImport() string {
return "github.com/bitechdev/ResolveSpec/pkg/common/sql_types"
}

pkg/writers/gorm/writer.go Normal file

@@ -0,0 +1,324 @@
package gorm
import (
"fmt"
"go/format"
"os"
"path/filepath"
"strings"
"git.warky.dev/wdevs/relspecgo/pkg/models"
"git.warky.dev/wdevs/relspecgo/pkg/writers"
)
// Writer implements the writers.Writer interface for GORM models
type Writer struct {
options *writers.WriterOptions
typeMapper *TypeMapper
templates *Templates
config *MethodConfig
}
// NewWriter creates a new GORM writer with the given options
func NewWriter(options *writers.WriterOptions) *Writer {
w := &Writer{
options: options,
typeMapper: NewTypeMapper(),
config: LoadMethodConfigFromMetadata(options.Metadata),
}
// Initialize templates
tmpl, err := NewTemplates()
if err != nil {
// Should not happen with embedded templates
panic(fmt.Sprintf("failed to initialize templates: %v", err))
}
w.templates = tmpl
return w
}
// WriteDatabase writes a complete database as GORM models
func (w *Writer) WriteDatabase(db *models.Database) error {
// Check if multi-file mode is enabled
multiFile := false
if w.options.Metadata != nil {
if mf, ok := w.options.Metadata["multi_file"].(bool); ok {
multiFile = mf
}
}
if multiFile {
return w.writeMultiFile(db)
}
return w.writeSingleFile(db)
}
// WriteSchema writes a schema as GORM models
func (w *Writer) WriteSchema(schema *models.Schema) error {
// Create a temporary database with just this schema
db := models.InitDatabase(schema.Name)
db.Schemas = []*models.Schema{schema}
return w.WriteDatabase(db)
}
// WriteTable writes a single table as a GORM model
func (w *Writer) WriteTable(table *models.Table) error {
// Create a temporary schema and database
schema := models.InitSchema(table.Schema)
schema.Tables = []*models.Table{table}
db := models.InitDatabase(schema.Name)
db.Schemas = []*models.Schema{schema}
return w.WriteDatabase(db)
}
// writeSingleFile writes all models to a single file
func (w *Writer) writeSingleFile(db *models.Database) error {
packageName := w.getPackageName()
templateData := NewTemplateData(packageName, w.config)
// Add sql_types import (always needed for nullable types)
templateData.AddImport(fmt.Sprintf("sql_types \"%s\"", w.typeMapper.GetSQLTypesImport()))
// Collect all models
for _, schema := range db.Schemas {
for _, table := range schema.Tables {
modelData := NewModelData(table, schema.Name, w.typeMapper)
// Add relationship fields
w.addRelationshipFields(modelData, table, schema, db)
templateData.AddModel(modelData)
// Check if we need time import
for _, field := range modelData.Fields {
if w.typeMapper.NeedsTimeImport(field.Type) {
templateData.AddImport("\"time\"")
}
}
}
}
// Add fmt import if GetIDStr is enabled
if w.config.GenerateGetIDStr {
templateData.AddImport("\"fmt\"")
}
// Finalize imports
templateData.FinalizeImports()
// Generate code
code, err := w.templates.GenerateCode(templateData)
if err != nil {
return fmt.Errorf("failed to generate code: %w", err)
}
// Format code
formatted, err := w.formatCode(code)
if err != nil {
// Return unformatted code with warning
fmt.Fprintf(os.Stderr, "Warning: failed to format code: %v\n", err)
formatted = code
}
// Write output
return w.writeOutput(formatted)
}
// writeMultiFile writes each table to a separate file
func (w *Writer) writeMultiFile(db *models.Database) error {
packageName := w.getPackageName()
// Ensure output path is a directory
if w.options.OutputPath == "" {
return fmt.Errorf("output path is required for multi-file mode")
}
// Create output directory if it doesn't exist
if err := os.MkdirAll(w.options.OutputPath, 0755); err != nil {
return fmt.Errorf("failed to create output directory: %w", err)
}
// Generate a file for each table
for _, schema := range db.Schemas {
for _, table := range schema.Tables {
// Create template data for this single table
templateData := NewTemplateData(packageName, w.config)
// Add sql_types import
templateData.AddImport(fmt.Sprintf("sql_types \"%s\"", w.typeMapper.GetSQLTypesImport()))
// Create model data
modelData := NewModelData(table, schema.Name, w.typeMapper)
// Add relationship fields
w.addRelationshipFields(modelData, table, schema, db)
templateData.AddModel(modelData)
// Check if we need time import
for _, field := range modelData.Fields {
if w.typeMapper.NeedsTimeImport(field.Type) {
templateData.AddImport("\"time\"")
}
}
// Add fmt import if GetIDStr is enabled
if w.config.GenerateGetIDStr {
templateData.AddImport("\"fmt\"")
}
// Finalize imports
templateData.FinalizeImports()
// Generate code
code, err := w.templates.GenerateCode(templateData)
if err != nil {
return fmt.Errorf("failed to generate code for table %s: %w", table.Name, err)
}
// Format code
formatted, err := w.formatCode(code)
if err != nil {
fmt.Fprintf(os.Stderr, "Warning: failed to format code for %s: %v\n", table.Name, err)
formatted = code
}
// Generate filename: sql_{schema}_{table}.go
filename := fmt.Sprintf("sql_%s_%s.go", schema.Name, table.Name)
outPath := filepath.Join(w.options.OutputPath, filename)
// Write file
if err := os.WriteFile(outPath, []byte(formatted), 0644); err != nil {
return fmt.Errorf("failed to write file %s: %w", filename, err)
}
}
}
return nil
}
// addRelationshipFields adds relationship fields to the model based on foreign keys
func (w *Writer) addRelationshipFields(modelData *ModelData, table *models.Table, schema *models.Schema, db *models.Database) {
// For each foreign key in this table, add a belongs-to relationship
for _, constraint := range table.Constraints {
if constraint.Type != models.ForeignKeyConstraint {
continue
}
// Find the referenced table
refTable := w.findTable(constraint.ReferencedSchema, constraint.ReferencedTable, db)
if refTable == nil {
continue
}
// Create relationship field (belongs-to)
refModelName := w.getModelName(constraint.ReferencedTable)
fieldName := w.generateRelationshipFieldName(constraint.ReferencedTable)
relationTag := w.typeMapper.BuildRelationshipTag(constraint, false)
modelData.AddRelationshipField(&FieldData{
Name: fieldName,
Type: "*" + refModelName, // Pointer type
GormTag: relationTag,
JSONTag: strings.ToLower(fieldName) + ",omitempty",
Comment: fmt.Sprintf("Belongs to %s", refModelName),
})
}
// For each table that references this table, add a has-many relationship
for _, otherSchema := range db.Schemas {
for _, otherTable := range otherSchema.Tables {
if otherTable.Name == table.Name && otherSchema.Name == schema.Name {
continue // Skip self
}
for _, constraint := range otherTable.Constraints {
if constraint.Type != models.ForeignKeyConstraint {
continue
}
// Check if this constraint references our table
if constraint.ReferencedTable == table.Name && constraint.ReferencedSchema == schema.Name {
// Add has-many relationship
otherModelName := w.getModelName(otherTable.Name)
fieldName := w.generateRelationshipFieldName(otherTable.Name) + "s" // naive pluralization of the prefix
relationTag := w.typeMapper.BuildRelationshipTag(constraint, true)
modelData.AddRelationshipField(&FieldData{
Name: fieldName,
Type: "[]*" + otherModelName, // Slice of pointers
GormTag: relationTag,
JSONTag: strings.ToLower(fieldName) + ",omitempty",
Comment: fmt.Sprintf("Has many %s", otherModelName),
})
}
}
}
}
}
// findTable finds a table by schema and name in the database
func (w *Writer) findTable(schemaName, tableName string, db *models.Database) *models.Table {
for _, schema := range db.Schemas {
if schema.Name != schemaName {
continue
}
for _, table := range schema.Tables {
if table.Name == tableName {
return table
}
}
}
return nil
}
// getModelName generates the model name from a table name
func (w *Writer) getModelName(tableName string) string {
singular := Singularize(tableName)
modelName := SnakeCaseToPascalCase(singular)
if !hasModelPrefix(modelName) {
modelName = "Model" + modelName
}
return modelName
}
// generateRelationshipFieldName generates a field name for a relationship
func (w *Writer) generateRelationshipFieldName(tableName string) string {
// Use just the prefix (3 letters) for relationship fields
return GeneratePrefix(tableName)
}
// getPackageName returns the package name from options or defaults to "models"
func (w *Writer) getPackageName() string {
if w.options.PackageName != "" {
return w.options.PackageName
}
return "models"
}
// formatCode formats Go code using gofmt
func (w *Writer) formatCode(code string) (string, error) {
formatted, err := format.Source([]byte(code))
if err != nil {
return "", fmt.Errorf("format error: %w", err)
}
return string(formatted), nil
}
// writeOutput writes the content to file or stdout
func (w *Writer) writeOutput(content string) error {
if w.options.OutputPath != "" {
return os.WriteFile(w.options.OutputPath, []byte(content), 0644)
}
// Print to stdout
fmt.Print(content)
return nil
}


@@ -0,0 +1,243 @@
package gorm
import (
"os"
"path/filepath"
"strings"
"testing"
"git.warky.dev/wdevs/relspecgo/pkg/models"
"git.warky.dev/wdevs/relspecgo/pkg/writers"
)
func TestWriter_WriteTable(t *testing.T) {
// Create a simple table
table := models.InitTable("users", "public")
table.Columns["id"] = &models.Column{
Name: "id",
Type: "bigint",
NotNull: true,
IsPrimaryKey: true,
AutoIncrement: true,
Sequence: 1,
}
table.Columns["email"] = &models.Column{
Name: "email",
Type: "varchar",
Length: 255,
NotNull: false,
Sequence: 2,
}
table.Columns["created_at"] = &models.Column{
Name: "created_at",
Type: "timestamp",
NotNull: true,
Sequence: 3,
}
// Create writer
opts := &writers.WriterOptions{
PackageName: "models",
Metadata: map[string]interface{}{
"generate_table_name": true,
"generate_get_id": true,
},
}
writer := NewWriter(opts)
// Write to temporary file
tmpDir := t.TempDir()
opts.OutputPath = filepath.Join(tmpDir, "test.go")
err := writer.WriteTable(table)
if err != nil {
t.Fatalf("WriteTable failed: %v", err)
}
// Read the generated file
content, err := os.ReadFile(opts.OutputPath)
if err != nil {
t.Fatalf("Failed to read generated file: %v", err)
}
generated := string(content)
// Verify key elements are present
expectations := []string{
"package models",
"type ModelUser struct",
"ID",
"int64",
"Email",
"sql_types.SqlString",
"CreatedAt",
"time.Time",
"gorm:\"column:id",
"gorm:\"column:email",
"func (m ModelUser) TableName() string",
"return \"public.users\"",
"func (m ModelUser) GetID() int64",
}
for _, expected := range expectations {
if !strings.Contains(generated, expected) {
t.Errorf("Generated code missing expected content: %q\nGenerated:\n%s", expected, generated)
}
}
}
func TestWriter_WriteDatabase_MultiFile(t *testing.T) {
// Create a database with two tables
db := models.InitDatabase("testdb")
schema := models.InitSchema("public")
// Table 1: users
users := models.InitTable("users", "public")
users.Columns["id"] = &models.Column{
Name: "id",
Type: "bigint",
NotNull: true,
IsPrimaryKey: true,
}
schema.Tables = append(schema.Tables, users)
// Table 2: posts
posts := models.InitTable("posts", "public")
posts.Columns["id"] = &models.Column{
Name: "id",
Type: "bigint",
NotNull: true,
IsPrimaryKey: true,
}
posts.Columns["user_id"] = &models.Column{
Name: "user_id",
Type: "bigint",
NotNull: true,
}
posts.Constraints["fk_user"] = &models.Constraint{
Name: "fk_user",
Type: models.ForeignKeyConstraint,
Columns: []string{"user_id"},
ReferencedTable: "users",
ReferencedSchema: "public",
ReferencedColumns: []string{"id"},
OnDelete: "CASCADE",
}
schema.Tables = append(schema.Tables, posts)
db.Schemas = append(db.Schemas, schema)
// Create writer with multi-file mode
tmpDir := t.TempDir()
opts := &writers.WriterOptions{
PackageName: "models",
OutputPath: tmpDir,
Metadata: map[string]interface{}{
"multi_file": true,
},
}
writer := NewWriter(opts)
err := writer.WriteDatabase(db)
if err != nil {
t.Fatalf("WriteDatabase failed: %v", err)
}
// Verify two files were created
expectedFiles := []string{
"sql_public_users.go",
"sql_public_posts.go",
}
for _, filename := range expectedFiles {
path := filepath.Join(tmpDir, filename)
if _, err := os.Stat(path); os.IsNotExist(err) {
t.Errorf("Expected file not created: %s", filename)
}
}
// Check posts file contains relationship
postsContent, err := os.ReadFile(filepath.Join(tmpDir, "sql_public_posts.go"))
if err != nil {
t.Fatalf("Failed to read posts file: %v", err)
}
// Relationship field (belongs-to pointer) should be present
if !strings.Contains(string(postsContent), "*ModelUser") {
t.Errorf("Posts model missing relationship field of type *ModelUser\nGenerated:\n%s", string(postsContent))
}
}
func TestNameConverter_SnakeCaseToPascalCase(t *testing.T) {
tests := []struct {
input string
expected string
}{
{"user_id", "UserID"},
{"http_request", "HTTPRequest"},
{"user_profiles", "UserProfiles"},
{"guid", "GUID"},
{"rid_process", "RIDProcess"},
}
for _, tt := range tests {
t.Run(tt.input, func(t *testing.T) {
result := SnakeCaseToPascalCase(tt.input)
if result != tt.expected {
t.Errorf("SnakeCaseToPascalCase(%q) = %q, want %q", tt.input, result, tt.expected)
}
})
}
}
func TestNameConverter_Pluralize(t *testing.T) {
tests := []struct {
input string
expected string
}{
{"user", "users"},
{"process", "processes"},
{"child", "children"},
{"person", "people"},
{"status", "statuses"},
}
for _, tt := range tests {
t.Run(tt.input, func(t *testing.T) {
result := Pluralize(tt.input)
if result != tt.expected {
t.Errorf("Pluralize(%q) = %q, want %q", tt.input, result, tt.expected)
}
})
}
}
func TestTypeMapper_SQLTypeToGoType(t *testing.T) {
mapper := NewTypeMapper()
tests := []struct {
sqlType string
notNull bool
want string
}{
{"bigint", true, "int64"},
{"bigint", false, "sql_types.SqlInt64"},
{"varchar", true, "string"},
{"varchar", false, "sql_types.SqlString"},
{"timestamp", true, "time.Time"},
{"timestamp", false, "sql_types.SqlTime"},
{"boolean", true, "bool"},
{"boolean", false, "sql_types.SqlBool"},
}
for _, tt := range tests {
t.Run(tt.sqlType, func(t *testing.T) {
result := mapper.SQLTypeToGoType(tt.sqlType, tt.notNull)
if result != tt.want {
t.Errorf("SQLTypeToGoType(%q, %v) = %q, want %q", tt.sqlType, tt.notNull, result, tt.want)
}
})
}
}


@@ -5,10 +5,16 @@ import (
)
// Writer defines the interface for writing database specifications
// to various output formats at different granularity levels
type Writer interface {
// WriteDatabase takes a Database model and writes it to the desired format
WriteDatabase(db *models.Database) error
// WriteSchema takes a Schema model and writes it to the desired format
WriteSchema(schema *models.Schema) error
// WriteTable takes a Table model and writes it to the desired format
WriteTable(table *models.Table) error
}
// WriterOptions contains common options for writers