Compare commits
17 Commits

| SHA1 |
|---|
| e0e7b64c69 |
| 4181cb1fbd |
| 120ffc6a5a |
| b20ad35485 |
| f258f8baeb |
| 6388daba56 |
| f6c3f2b460 |
| 156e655571 |
| b57e1ba304 |
| 19fba62f1b |
| b4ff4334cc |
| 5d9b00c8f2 |
| debf351c48 |
| d87d657275 |
| 1795eb64d1 |
| 355f0f918f |
| 5d3c86119e |
@@ -4,10 +4,7 @@
    "description": "Database Relations Specification Tool for Go",
    "language": "go"
  },
  "agent": {
    "preferred": "Explore",
    "description": "Use Explore agent for fast codebase navigation and Go project exploration"
  },

  "codeStyle": {
    "useGofmt": true,
    "lineLength": 100,
5  .github/workflows/integration-tests.yml  (vendored)
@@ -46,6 +46,11 @@ jobs:
      - name: Download dependencies
        run: go mod download

      - name: Install PostgreSQL client
        run: |
          sudo apt-get update
          sudo apt-get install -y postgresql-client

      - name: Initialize test database
        env:
          PGPASSWORD: relspec_test_password
72  README.md
@@ -85,6 +85,29 @@ RelSpec includes a powerful schema validation and linting tool:

## Use of AI

[Rules and use of AI](./AI_USE.md)

## User Interface

RelSpec provides an interactive terminal-based user interface for managing and editing database schemas. The UI allows you to:

- **Browse Databases** - Navigate through your database structure with an intuitive menu system
- **Edit Schemas** - Create, modify, and organize database schemas
- **Manage Tables** - Add, update, or delete tables with full control over structure
- **Configure Columns** - Define column properties, data types, constraints, and relationships
- **Interactive Editing** - Real-time validation and feedback as you make changes

The interface supports multiple input formats, making it easy to load, edit, and save your database definitions in various formats.

<p align="center" width="100%">
  <img src="./assets/image/screenshots/main_screen.jpg">
</p>
<p align="center" width="100%">
  <img src="./assets/image/screenshots/table_view.jpg">
</p>
<p align="center" width="100%">
  <img src="./assets/image/screenshots/edit_column.jpg">
</p>

## Installation

```bash
@@ -95,6 +118,55 @@ go install -v git.warky.dev/wdevs/relspecgo/cmd/relspec@latest

## Usage

### Interactive Schema Editor

```bash
# Launch interactive editor with a DBML schema
relspec edit --from dbml --from-path schema.dbml --to dbml --to-path schema.dbml

# Edit PostgreSQL database in place
relspec edit --from pgsql --from-conn "postgres://user:pass@localhost/mydb" \
  --to pgsql --to-conn "postgres://user:pass@localhost/mydb"

# Edit JSON schema and save as GORM models
relspec edit --from json --from-path db.json --to gorm --to-path models/
```

The `edit` command launches an interactive terminal user interface where you can:
- Browse and navigate your database structure
- Create, modify, and delete schemas, tables, and columns
- Configure column properties, constraints, and relationships
- Save changes to various formats
- Import and merge schemas from other databases

### Schema Merging

```bash
# Merge two JSON schemas (additive merge - adds missing items only)
relspec merge --target json --target-path base.json \
  --source json --source-path additions.json \
  --output json --output-path merged.json

# Merge PostgreSQL database into JSON, skipping specific tables
relspec merge --target json --target-path current.json \
  --source pgsql --source-conn "postgres://user:pass@localhost/source_db" \
  --output json --output-path updated.json \
  --skip-tables "audit_log,temp_tables"

# Cross-format merge (DBML + YAML → JSON)
relspec merge --target dbml --target-path base.dbml \
  --source yaml --source-path additions.yaml \
  --output json --output-path result.json \
  --skip-relations --skip-views
```

The `merge` command combines two database schemas additively:
- Adds missing schemas, tables, columns, and other objects
- Never modifies or deletes existing items (safe operation)
- Supports selective merging with skip options (domains, relations, enums, views, sequences, specific tables)
- Works across any combination of supported formats
- Perfect for integrating multiple schema definitions or applying patches

### Schema Conversion

11  TODO.md
@@ -22,6 +22,17 @@
- [✔️] GraphQL schema generation

## UI
- [✔️] Basic UI (I went with tview)
- [✔️] Save / Load Database
- [✔️] Schemas / Domains / Tables
- [ ] Add Relations
- [ ] Add Indexes
- [ ] Add Views
- [ ] Add Sequences
- [ ] Add Scripts
- [ ] Domain / Table Assignment

## Documentation
- [ ] API documentation (godoc)
- [ ] Usage examples for each format combination
||||
BIN
assets/image/screenshots/edit_column.jpg
Normal file
BIN
assets/image/screenshots/edit_column.jpg
Normal file
Binary file not shown.
|
After Width: | Height: | Size: 42 KiB |
BIN
assets/image/screenshots/main_screen.jpg
Normal file
BIN
assets/image/screenshots/main_screen.jpg
Normal file
Binary file not shown.
|
After Width: | Height: | Size: 50 KiB |
BIN
assets/image/screenshots/table_view.jpg
Normal file
BIN
assets/image/screenshots/table_view.jpg
Normal file
Binary file not shown.
|
After Width: | Height: | Size: 67 KiB |
334  cmd/relspec/edit.go  (new file)
@@ -0,0 +1,334 @@
package main

import (
	"fmt"
	"os"
	"strings"

	"github.com/spf13/cobra"

	"git.warky.dev/wdevs/relspecgo/pkg/models"
	"git.warky.dev/wdevs/relspecgo/pkg/readers"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/bun"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/dbml"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/dctx"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/drawdb"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/drizzle"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/gorm"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/graphql"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/json"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/pgsql"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/prisma"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/typeorm"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/yaml"
	"git.warky.dev/wdevs/relspecgo/pkg/ui"
	"git.warky.dev/wdevs/relspecgo/pkg/writers"
	wbun "git.warky.dev/wdevs/relspecgo/pkg/writers/bun"
	wdbml "git.warky.dev/wdevs/relspecgo/pkg/writers/dbml"
	wdctx "git.warky.dev/wdevs/relspecgo/pkg/writers/dctx"
	wdrawdb "git.warky.dev/wdevs/relspecgo/pkg/writers/drawdb"
	wdrizzle "git.warky.dev/wdevs/relspecgo/pkg/writers/drizzle"
	wgorm "git.warky.dev/wdevs/relspecgo/pkg/writers/gorm"
	wgraphql "git.warky.dev/wdevs/relspecgo/pkg/writers/graphql"
	wjson "git.warky.dev/wdevs/relspecgo/pkg/writers/json"
	wpgsql "git.warky.dev/wdevs/relspecgo/pkg/writers/pgsql"
	wprisma "git.warky.dev/wdevs/relspecgo/pkg/writers/prisma"
	wtypeorm "git.warky.dev/wdevs/relspecgo/pkg/writers/typeorm"
	wyaml "git.warky.dev/wdevs/relspecgo/pkg/writers/yaml"
)

var (
	editSourceType   string
	editSourcePath   string
	editSourceConn   string
	editTargetType   string
	editTargetPath   string
	editSchemaFilter string
)

var editCmd = &cobra.Command{
	Use:   "edit",
	Short: "Edit database schema interactively with TUI",
	Long: `Edit database schemas from various formats using an interactive terminal UI.

Allows you to:
  - List and navigate schemas and tables
  - Create, edit, and delete schemas
  - Create, edit, and delete tables
  - Add, edit, and delete columns
  - Set table and column properties
  - Add constraints, indexes, and relationships

Supports reading from and writing to all supported formats:

Input formats:
  - dbml: DBML schema files
  - dctx: DCTX schema files
  - drawdb: DrawDB JSON files
  - graphql: GraphQL schema files (.graphql, SDL)
  - json: JSON database schema
  - yaml: YAML database schema
  - gorm: GORM model files (Go, file or directory)
  - bun: Bun model files (Go, file or directory)
  - drizzle: Drizzle ORM schema files (TypeScript, file or directory)
  - prisma: Prisma schema files (.prisma)
  - typeorm: TypeORM entity files (TypeScript)
  - pgsql: PostgreSQL database (live connection)

Output formats:
  - dbml: DBML schema files
  - dctx: DCTX schema files
  - drawdb: DrawDB JSON files
  - graphql: GraphQL schema files (.graphql, SDL)
  - json: JSON database schema
  - yaml: YAML database schema
  - gorm: GORM model files (Go)
  - bun: Bun model files (Go)
  - drizzle: Drizzle ORM schema files (TypeScript)
  - prisma: Prisma schema files (.prisma)
  - typeorm: TypeORM entity files (TypeScript)
  - pgsql: PostgreSQL SQL schema

PostgreSQL Connection String Examples:
  postgres://username:password@localhost:5432/database_name
  postgres://username:password@localhost/database_name
  postgresql://user:pass@host:5432/dbname?sslmode=disable
  postgresql://user:pass@host/dbname?sslmode=require
  host=localhost port=5432 user=username password=pass dbname=mydb sslmode=disable

Examples:
  # Edit a DBML schema file
  relspec edit --from dbml --from-path schema.dbml --to dbml --to-path schema.dbml

  # Edit a PostgreSQL database
  relspec edit --from pgsql --from-conn "postgres://user:pass@localhost/mydb" \
    --to pgsql --to-conn "postgres://user:pass@localhost/mydb"

  # Edit JSON schema and output to GORM
  relspec edit --from json --from-path db.json --to gorm --to-path models/

  # Edit GORM models in place
  relspec edit --from gorm --from-path ./models --to gorm --to-path ./models`,
	RunE: runEdit,
}

func init() {
	editCmd.Flags().StringVar(&editSourceType, "from", "", "Source format (dbml, dctx, drawdb, graphql, json, yaml, gorm, bun, drizzle, prisma, typeorm, pgsql)")
	editCmd.Flags().StringVar(&editSourcePath, "from-path", "", "Source file path (for file-based formats)")
	editCmd.Flags().StringVar(&editSourceConn, "from-conn", "", "Source connection string (for database formats)")
	editCmd.Flags().StringVar(&editTargetType, "to", "", "Target format (dbml, dctx, drawdb, graphql, json, yaml, gorm, bun, drizzle, prisma, typeorm, pgsql)")
	editCmd.Flags().StringVar(&editTargetPath, "to-path", "", "Target file path (for file-based formats)")
	editCmd.Flags().StringVar(&editSchemaFilter, "schema", "", "Filter to a specific schema by name")

	// Flags are now optional - if not provided, UI will prompt for load/save options
}

func runEdit(cmd *cobra.Command, args []string) error {
	fmt.Fprintf(os.Stderr, "\n=== RelSpec Schema Editor ===\n")
	fmt.Fprintf(os.Stderr, "Started at: %s\n\n", getCurrentTimestamp())

	var db *models.Database
	var loadConfig *ui.LoadConfig
	var saveConfig *ui.SaveConfig
	var err error

	// Check if source parameters are provided
	if editSourceType != "" {
		// Read source database
		fmt.Fprintf(os.Stderr, "[1/3] Reading source schema...\n")
		fmt.Fprintf(os.Stderr, " Format: %s\n", editSourceType)
		if editSourcePath != "" {
			fmt.Fprintf(os.Stderr, " Path: %s\n", editSourcePath)
		}
		if editSourceConn != "" {
			fmt.Fprintf(os.Stderr, " Conn: %s\n", maskPassword(editSourceConn))
		}

		db, err = readDatabaseForEdit(editSourceType, editSourcePath, editSourceConn, "Source")
		if err != nil {
			return fmt.Errorf("failed to read source: %w", err)
		}

		// Apply schema filter if specified
		if editSchemaFilter != "" {
			db = filterDatabaseBySchema(db, editSchemaFilter)
		}

		fmt.Fprintf(os.Stderr, " ✓ Successfully read database '%s'\n", db.Name)
		fmt.Fprintf(os.Stderr, " Found: %d schema(s)\n", len(db.Schemas))

		totalTables := 0
		for _, schema := range db.Schemas {
			totalTables += len(schema.Tables)
		}
		fmt.Fprintf(os.Stderr, " Found: %d table(s)\n\n", totalTables)

		// Store load config
		loadConfig = &ui.LoadConfig{
			SourceType: editSourceType,
			FilePath:   editSourcePath,
			ConnString: editSourceConn,
		}
	} else {
		// No source parameters provided, UI will show load screen
		fmt.Fprintf(os.Stderr, "[1/2] No source specified, editor will prompt for database\n\n")
	}

	// Store save config if target parameters are provided
	if editTargetType != "" {
		saveConfig = &ui.SaveConfig{
			TargetType: editTargetType,
			FilePath:   editTargetPath,
		}
	}

	// Launch interactive TUI
	if editSourceType != "" {
		fmt.Fprintf(os.Stderr, "[2/3] Launching interactive editor...\n")
	} else {
		fmt.Fprintf(os.Stderr, "[2/2] Launching interactive editor...\n")
	}
	fmt.Fprintf(os.Stderr, " Use arrow keys and shortcuts to navigate\n")
	fmt.Fprintf(os.Stderr, " Press ? for help\n\n")

	editor := ui.NewSchemaEditorWithConfigs(db, loadConfig, saveConfig)
	if err := editor.Run(); err != nil {
		return fmt.Errorf("editor failed: %w", err)
	}

	// Only write to output if target parameters were provided and database was loaded from command line
	if editTargetType != "" && editSourceType != "" && db != nil {
		fmt.Fprintf(os.Stderr, "[3/3] Writing changes to output...\n")
		fmt.Fprintf(os.Stderr, " Format: %s\n", editTargetType)
		if editTargetPath != "" {
			fmt.Fprintf(os.Stderr, " Path: %s\n", editTargetPath)
		}

		// Get the potentially modified database from the editor
		err = writeDatabaseForEdit(editTargetType, editTargetPath, "", editor.GetDatabase(), "Target")
		if err != nil {
			return fmt.Errorf("failed to write output: %w", err)
		}

		fmt.Fprintf(os.Stderr, " ✓ Successfully written database\n")
	}

	fmt.Fprintf(os.Stderr, "\n=== Edit complete ===\n")

	return nil
}

func readDatabaseForEdit(dbType, filePath, connString, label string) (*models.Database, error) {
	var reader readers.Reader

	switch strings.ToLower(dbType) {
	case "dbml":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for DBML format", label)
		}
		reader = dbml.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "dctx":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for DCTX format", label)
		}
		reader = dctx.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "drawdb":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for DrawDB format", label)
		}
		reader = drawdb.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "graphql":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for GraphQL format", label)
		}
		reader = graphql.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "json":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for JSON format", label)
		}
		reader = json.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "yaml":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for YAML format", label)
		}
		reader = yaml.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "gorm":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for GORM format", label)
		}
		reader = gorm.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "bun":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for Bun format", label)
		}
		reader = bun.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "drizzle":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for Drizzle format", label)
		}
		reader = drizzle.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "prisma":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for Prisma format", label)
		}
		reader = prisma.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "typeorm":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for TypeORM format", label)
		}
		reader = typeorm.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "pgsql":
		if connString == "" {
			return nil, fmt.Errorf("%s: connection string is required for PostgreSQL format", label)
		}
		reader = pgsql.NewReader(&readers.ReaderOptions{ConnectionString: connString})
	default:
		return nil, fmt.Errorf("%s: unsupported format: %s", label, dbType)
	}

	db, err := reader.ReadDatabase()
	if err != nil {
		return nil, fmt.Errorf("%s: %w", label, err)
	}

	return db, nil
}

func writeDatabaseForEdit(dbType, filePath, connString string, db *models.Database, label string) error {
	var writer writers.Writer

	switch strings.ToLower(dbType) {
	case "dbml":
		writer = wdbml.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "dctx":
		writer = wdctx.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "drawdb":
		writer = wdrawdb.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "graphql":
		writer = wgraphql.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "json":
		writer = wjson.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "yaml":
		writer = wyaml.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "gorm":
		writer = wgorm.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "bun":
		writer = wbun.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "drizzle":
		writer = wdrizzle.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "prisma":
		writer = wprisma.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "typeorm":
		writer = wtypeorm.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "pgsql":
		writer = wpgsql.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	default:
		return fmt.Errorf("%s: unsupported format: %s", label, dbType)
	}

	err := writer.WriteDatabase(db)
	if err != nil {
		return fmt.Errorf("%s: %w", label, err)
	}

	return nil
}
433  cmd/relspec/merge.go  (new file)
@@ -0,0 +1,433 @@
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"

	"github.com/spf13/cobra"

	"git.warky.dev/wdevs/relspecgo/pkg/merge"
	"git.warky.dev/wdevs/relspecgo/pkg/models"
	"git.warky.dev/wdevs/relspecgo/pkg/readers"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/bun"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/dbml"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/dctx"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/drawdb"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/drizzle"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/gorm"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/graphql"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/json"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/pgsql"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/prisma"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/typeorm"
	"git.warky.dev/wdevs/relspecgo/pkg/readers/yaml"
	"git.warky.dev/wdevs/relspecgo/pkg/writers"
	wbun "git.warky.dev/wdevs/relspecgo/pkg/writers/bun"
	wdbml "git.warky.dev/wdevs/relspecgo/pkg/writers/dbml"
	wdctx "git.warky.dev/wdevs/relspecgo/pkg/writers/dctx"
	wdrawdb "git.warky.dev/wdevs/relspecgo/pkg/writers/drawdb"
	wdrizzle "git.warky.dev/wdevs/relspecgo/pkg/writers/drizzle"
	wgorm "git.warky.dev/wdevs/relspecgo/pkg/writers/gorm"
	wgraphql "git.warky.dev/wdevs/relspecgo/pkg/writers/graphql"
	wjson "git.warky.dev/wdevs/relspecgo/pkg/writers/json"
	wpgsql "git.warky.dev/wdevs/relspecgo/pkg/writers/pgsql"
	wprisma "git.warky.dev/wdevs/relspecgo/pkg/writers/prisma"
	wtypeorm "git.warky.dev/wdevs/relspecgo/pkg/writers/typeorm"
	wyaml "git.warky.dev/wdevs/relspecgo/pkg/writers/yaml"
)

var (
	mergeTargetType    string
	mergeTargetPath    string
	mergeTargetConn    string
	mergeSourceType    string
	mergeSourcePath    string
	mergeSourceConn    string
	mergeOutputType    string
	mergeOutputPath    string
	mergeOutputConn    string
	mergeSkipDomains   bool
	mergeSkipRelations bool
	mergeSkipEnums     bool
	mergeSkipViews     bool
	mergeSkipSequences bool
	mergeSkipTables    string // Comma-separated table names to skip
	mergeVerbose       bool
)

var mergeCmd = &cobra.Command{
	Use:   "merge",
	Short: "Merge database schemas (additive only - adds missing items)",
	Long: `Merge one database schema into another. Performs additive merging only:
adds missing schemas, tables, columns, and other objects without modifying
or deleting existing items.

The target database is loaded first, then the source database is merged into it.
The result can be saved to a new format or updated in place.

Examples:
  # Merge two JSON schemas
  relspec merge --target json --target-path base.json \
    --source json --source-path additional.json \
    --output json --output-path merged.json

  # Merge from PostgreSQL into JSON
  relspec merge --target json --target-path mydb.json \
    --source pgsql --source-conn "postgres://user:pass@localhost/source_db" \
    --output json --output-path combined.json

  # Merge DBML and YAML, skip relations
  relspec merge --target dbml --target-path schema.dbml \
    --source yaml --source-path tables.yaml \
    --output dbml --output-path merged.dbml \
    --skip-relations

  # Merge and save back to target format
  relspec merge --target json --target-path base.json \
    --source json --source-path patch.json \
    --output json --output-path base.json`,
	RunE: runMerge,
}

func init() {
	// Target database flags
	mergeCmd.Flags().StringVar(&mergeTargetType, "target", "", "Target format (required): dbml, dctx, drawdb, graphql, json, yaml, gorm, bun, drizzle, prisma, typeorm, pgsql")
	mergeCmd.Flags().StringVar(&mergeTargetPath, "target-path", "", "Target file path (required for file-based formats)")
	mergeCmd.Flags().StringVar(&mergeTargetConn, "target-conn", "", "Target connection string (required for pgsql)")

	// Source database flags
	mergeCmd.Flags().StringVar(&mergeSourceType, "source", "", "Source format (required): dbml, dctx, drawdb, graphql, json, yaml, gorm, bun, drizzle, prisma, typeorm, pgsql")
	mergeCmd.Flags().StringVar(&mergeSourcePath, "source-path", "", "Source file path (required for file-based formats)")
	mergeCmd.Flags().StringVar(&mergeSourceConn, "source-conn", "", "Source connection string (required for pgsql)")

	// Output flags
	mergeCmd.Flags().StringVar(&mergeOutputType, "output", "", "Output format (required): dbml, dctx, drawdb, graphql, json, yaml, gorm, bun, drizzle, prisma, typeorm, pgsql")
	mergeCmd.Flags().StringVar(&mergeOutputPath, "output-path", "", "Output file path (required for file-based formats)")
	mergeCmd.Flags().StringVar(&mergeOutputConn, "output-conn", "", "Output connection string (for pgsql)")

	// Merge options
	mergeCmd.Flags().BoolVar(&mergeSkipDomains, "skip-domains", false, "Skip domains during merge")
	mergeCmd.Flags().BoolVar(&mergeSkipRelations, "skip-relations", false, "Skip relations during merge")
	mergeCmd.Flags().BoolVar(&mergeSkipEnums, "skip-enums", false, "Skip enums during merge")
	mergeCmd.Flags().BoolVar(&mergeSkipViews, "skip-views", false, "Skip views during merge")
	mergeCmd.Flags().BoolVar(&mergeSkipSequences, "skip-sequences", false, "Skip sequences during merge")
	mergeCmd.Flags().StringVar(&mergeSkipTables, "skip-tables", "", "Comma-separated list of table names to skip during merge")
	mergeCmd.Flags().BoolVar(&mergeVerbose, "verbose", false, "Show verbose output")
}

func runMerge(cmd *cobra.Command, args []string) error {
	fmt.Fprintf(os.Stderr, "\n=== RelSpec Merge ===\n")
	fmt.Fprintf(os.Stderr, "Started at: %s\n\n", getCurrentTimestamp())

	// Validate required flags
	if mergeTargetType == "" {
		return fmt.Errorf("--target format is required")
	}
	if mergeSourceType == "" {
		return fmt.Errorf("--source format is required")
	}
	if mergeOutputType == "" {
		return fmt.Errorf("--output format is required")
	}

	// Validate and expand file paths
	if mergeTargetType != "pgsql" {
		if mergeTargetPath == "" {
			return fmt.Errorf("--target-path is required for %s format", mergeTargetType)
		}
		mergeTargetPath = expandPath(mergeTargetPath)
	} else if mergeTargetConn == "" {
		return fmt.Errorf("--target-conn is required for pgsql format")
	}

	if mergeSourceType != "pgsql" {
		if mergeSourcePath == "" {
			return fmt.Errorf("--source-path is required for %s format", mergeSourceType)
		}
		mergeSourcePath = expandPath(mergeSourcePath)
	} else if mergeSourceConn == "" {
		return fmt.Errorf("--source-conn is required for pgsql format")
	}

	if mergeOutputType != "pgsql" {
		if mergeOutputPath == "" {
			return fmt.Errorf("--output-path is required for %s format", mergeOutputType)
		}
		mergeOutputPath = expandPath(mergeOutputPath)
	}

	// Step 1: Read target database
	fmt.Fprintf(os.Stderr, "[1/4] Reading target database...\n")
	fmt.Fprintf(os.Stderr, " Format: %s\n", mergeTargetType)
	if mergeTargetPath != "" {
		fmt.Fprintf(os.Stderr, " Path: %s\n", mergeTargetPath)
	}
	if mergeTargetConn != "" {
		fmt.Fprintf(os.Stderr, " Conn: %s\n", maskPassword(mergeTargetConn))
	}

	targetDB, err := readDatabaseForMerge(mergeTargetType, mergeTargetPath, mergeTargetConn, "Target")
	if err != nil {
		return fmt.Errorf("failed to read target database: %w", err)
	}
	fmt.Fprintf(os.Stderr, " ✓ Successfully read target database '%s'\n", targetDB.Name)
	printDatabaseStats(targetDB)

	// Step 2: Read source database
	fmt.Fprintf(os.Stderr, "\n[2/4] Reading source database...\n")
	fmt.Fprintf(os.Stderr, " Format: %s\n", mergeSourceType)
	if mergeSourcePath != "" {
		fmt.Fprintf(os.Stderr, " Path: %s\n", mergeSourcePath)
	}
	if mergeSourceConn != "" {
		fmt.Fprintf(os.Stderr, " Conn: %s\n", maskPassword(mergeSourceConn))
	}

	sourceDB, err := readDatabaseForMerge(mergeSourceType, mergeSourcePath, mergeSourceConn, "Source")
	if err != nil {
		return fmt.Errorf("failed to read source database: %w", err)
	}
	fmt.Fprintf(os.Stderr, " ✓ Successfully read source database '%s'\n", sourceDB.Name)
	printDatabaseStats(sourceDB)

	// Step 3: Merge databases
	fmt.Fprintf(os.Stderr, "\n[3/4] Merging databases...\n")

	opts := &merge.MergeOptions{
		SkipDomains:   mergeSkipDomains,
		SkipRelations: mergeSkipRelations,
		SkipEnums:     mergeSkipEnums,
		SkipViews:     mergeSkipViews,
		SkipSequences: mergeSkipSequences,
	}

	// Parse skip-tables flag
	if mergeSkipTables != "" {
		opts.SkipTableNames = parseSkipTables(mergeSkipTables)
		if len(opts.SkipTableNames) > 0 {
			fmt.Fprintf(os.Stderr, " Skipping tables: %s\n", mergeSkipTables)
		}
	}

	result := merge.MergeDatabases(targetDB, sourceDB, opts)

	// Update timestamp
	targetDB.UpdateDate()

	// Print merge summary
	fmt.Fprintf(os.Stderr, " ✓ Merge complete\n\n")
	fmt.Fprintf(os.Stderr, "%s\n", merge.GetMergeSummary(result))

	// Step 4: Write output
	fmt.Fprintf(os.Stderr, "\n[4/4] Writing output...\n")
	fmt.Fprintf(os.Stderr, " Format: %s\n", mergeOutputType)
	if mergeOutputPath != "" {
		fmt.Fprintf(os.Stderr, " Path: %s\n", mergeOutputPath)
	}

	err = writeDatabaseForMerge(mergeOutputType, mergeOutputPath, "", targetDB, "Output")
	if err != nil {
		return fmt.Errorf("failed to write output: %w", err)
	}

	fmt.Fprintf(os.Stderr, " ✓ Successfully written merged database\n")
	fmt.Fprintf(os.Stderr, "\n=== Merge complete ===\n")

	return nil
}
|
||||
|
||||
func readDatabaseForMerge(dbType, filePath, connString, label string) (*models.Database, error) {
	var reader readers.Reader

	switch strings.ToLower(dbType) {
	case "dbml":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for DBML format", label)
		}
		reader = dbml.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "dctx":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for DCTX format", label)
		}
		reader = dctx.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "drawdb":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for DrawDB format", label)
		}
		reader = drawdb.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "graphql":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for GraphQL format", label)
		}
		reader = graphql.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "json":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for JSON format", label)
		}
		reader = json.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "yaml":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for YAML format", label)
		}
		reader = yaml.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "gorm":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for GORM format", label)
		}
		reader = gorm.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "bun":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for Bun format", label)
		}
		reader = bun.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "drizzle":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for Drizzle format", label)
		}
		reader = drizzle.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "prisma":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for Prisma format", label)
		}
		reader = prisma.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "typeorm":
		if filePath == "" {
			return nil, fmt.Errorf("%s: file path is required for TypeORM format", label)
		}
		reader = typeorm.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "pgsql":
		if connString == "" {
			return nil, fmt.Errorf("%s: connection string is required for PostgreSQL format", label)
		}
		reader = pgsql.NewReader(&readers.ReaderOptions{ConnectionString: connString})
	default:
		return nil, fmt.Errorf("%s: unsupported format '%s'", label, dbType)
	}

	db, err := reader.ReadDatabase()
	if err != nil {
		return nil, err
	}

	return db, nil
}

func writeDatabaseForMerge(dbType, filePath, connString string, db *models.Database, label string) error {
	var writer writers.Writer

	switch strings.ToLower(dbType) {
	case "dbml":
		if filePath == "" {
			return fmt.Errorf("%s: file path is required for DBML format", label)
		}
		writer = wdbml.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "dctx":
		if filePath == "" {
			return fmt.Errorf("%s: file path is required for DCTX format", label)
		}
		writer = wdctx.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "drawdb":
		if filePath == "" {
			return fmt.Errorf("%s: file path is required for DrawDB format", label)
		}
		writer = wdrawdb.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "graphql":
		if filePath == "" {
			return fmt.Errorf("%s: file path is required for GraphQL format", label)
		}
		writer = wgraphql.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "json":
		if filePath == "" {
			return fmt.Errorf("%s: file path is required for JSON format", label)
		}
		writer = wjson.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "yaml":
		if filePath == "" {
			return fmt.Errorf("%s: file path is required for YAML format", label)
		}
		writer = wyaml.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "gorm":
		if filePath == "" {
			return fmt.Errorf("%s: file path is required for GORM format", label)
		}
		writer = wgorm.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "bun":
		if filePath == "" {
			return fmt.Errorf("%s: file path is required for Bun format", label)
		}
		writer = wbun.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "drizzle":
		if filePath == "" {
			return fmt.Errorf("%s: file path is required for Drizzle format", label)
		}
		writer = wdrizzle.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "prisma":
		if filePath == "" {
			return fmt.Errorf("%s: file path is required for Prisma format", label)
		}
		writer = wprisma.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "typeorm":
		if filePath == "" {
			return fmt.Errorf("%s: file path is required for TypeORM format", label)
		}
		writer = wtypeorm.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "pgsql":
		writer = wpgsql.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	default:
		return fmt.Errorf("%s: unsupported format '%s'", label, dbType)
	}

	return writer.WriteDatabase(db)
}

func expandPath(path string) string {
	if len(path) > 0 && path[0] == '~' {
		home, err := os.UserHomeDir()
		if err == nil {
			return filepath.Join(home, path[1:])
		}
	}
	return path
}

func printDatabaseStats(db *models.Database) {
	totalTables := 0
	totalColumns := 0
	totalConstraints := 0
	totalIndexes := 0

	for _, schema := range db.Schemas {
		totalTables += len(schema.Tables)
		for _, table := range schema.Tables {
			totalColumns += len(table.Columns)
			totalConstraints += len(table.Constraints)
			totalIndexes += len(table.Indexes)
		}
	}

	fmt.Fprintf(os.Stderr, "  Schemas: %d, Tables: %d, Columns: %d, Constraints: %d, Indexes: %d\n",
		len(db.Schemas), totalTables, totalColumns, totalConstraints, totalIndexes)
}

func parseSkipTables(skipTablesStr string) map[string]bool {
	skipTables := make(map[string]bool)
	if skipTablesStr == "" {
		return skipTables
	}

	// Split by comma and trim whitespace
	parts := strings.Split(skipTablesStr, ",")
	for _, part := range parts {
		trimmed := strings.TrimSpace(part)
		if trimmed != "" {
			// Store in lowercase for case-insensitive matching
			skipTables[strings.ToLower(trimmed)] = true
		}
	}

	return skipTables
}
@@ -21,4 +21,7 @@ func init() {
	rootCmd.AddCommand(inspectCmd)
	rootCmd.AddCommand(scriptsCmd)
	rootCmd.AddCommand(templCmd)
	rootCmd.AddCommand(editCmd)
	rootCmd.AddCommand(mergeCmd)
	rootCmd.AddCommand(splitCmd)
}

cmd/relspec/split.go (new file, 318 lines)
@@ -0,0 +1,318 @@
package main

import (
	"fmt"
	"os"
	"strings"

	"github.com/spf13/cobra"

	"git.warky.dev/wdevs/relspecgo/pkg/models"
)

var (
	splitSourceType    string
	splitSourcePath    string
	splitSourceConn    string
	splitTargetType    string
	splitTargetPath    string
	splitSchemas       string
	splitTables        string
	splitPackageName   string
	splitDatabaseName  string
	splitExcludeSchema string
	splitExcludeTables string
)

var splitCmd = &cobra.Command{
	Use:   "split",
	Short: "Split database schemas to extract selected tables into a separate database",
	Long: `Extract selected schemas and tables from a database and write them to a separate output.

The split command allows you to:
- Select specific schemas to include in the output
- Select specific tables within schemas
- Exclude specific schemas or tables if preferred
- Export the selected subset to any supported format

Input formats:
- dbml: DBML schema files
- dctx: DCTX schema files
- drawdb: DrawDB JSON files
- graphql: GraphQL schema files (.graphql, SDL)
- json: JSON database schema
- yaml: YAML database schema
- gorm: GORM model files (Go, file or directory)
- bun: Bun model files (Go, file or directory)
- drizzle: Drizzle ORM schema files (TypeScript, file or directory)
- prisma: Prisma schema files (.prisma)
- typeorm: TypeORM entity files (TypeScript)
- pgsql: PostgreSQL database (live connection)

Output formats:
- dbml: DBML schema files
- dctx: DCTX schema files
- drawdb: DrawDB JSON files
- graphql: GraphQL schema files (.graphql, SDL)
- json: JSON database schema
- yaml: YAML database schema
- gorm: GORM model files (Go)
- bun: Bun model files (Go)
- drizzle: Drizzle ORM schema files (TypeScript)
- prisma: Prisma schema files (.prisma)
- typeorm: TypeORM entity files (TypeScript)
- pgsql: PostgreSQL SQL schema

Examples:
  # Split specific schemas from DBML
  relspec split --from dbml --from-path schema.dbml \
    --schemas public,auth \
    --to json --to-path subset.json

  # Extract specific tables from PostgreSQL
  relspec split --from pgsql \
    --from-conn "postgres://user:pass@localhost:5432/mydb" \
    --schemas public \
    --tables users,orders,products \
    --to dbml --to-path subset.dbml

  # Exclude specific tables
  relspec split --from json --from-path schema.json \
    --exclude-tables "audit_log,system_config,temp_data" \
    --to json --to-path public_schema.json

  # Split and convert to GORM
  relspec split --from json --from-path schema.json \
    --tables "users,posts,comments" \
    --to gorm --to-path models/ --package models \
    --database-name MyAppDB

  # Exclude specific schema and tables
  relspec split --from pgsql \
    --from-conn "postgres://user:pass@localhost/db" \
    --exclude-schema pg_catalog,information_schema \
    --exclude-tables "temp_users,debug_logs" \
    --to json --to-path public_schema.json`,
	RunE: runSplit,
}

func init() {
	splitCmd.Flags().StringVar(&splitSourceType, "from", "", "Source format (dbml, dctx, drawdb, graphql, json, yaml, gorm, bun, drizzle, prisma, typeorm, pgsql)")
	splitCmd.Flags().StringVar(&splitSourcePath, "from-path", "", "Source file path (for file-based formats)")
	splitCmd.Flags().StringVar(&splitSourceConn, "from-conn", "", "Source connection string (for database formats)")

	splitCmd.Flags().StringVar(&splitTargetType, "to", "", "Target format (dbml, dctx, drawdb, graphql, json, yaml, gorm, bun, drizzle, prisma, typeorm, pgsql)")
	splitCmd.Flags().StringVar(&splitTargetPath, "to-path", "", "Target output path (file or directory)")
	splitCmd.Flags().StringVar(&splitPackageName, "package", "", "Package name (for code generation formats like gorm/bun)")
	splitCmd.Flags().StringVar(&splitDatabaseName, "database-name", "", "Override database name in output")

	splitCmd.Flags().StringVar(&splitSchemas, "schemas", "", "Comma-separated list of schema names to include")
	splitCmd.Flags().StringVar(&splitTables, "tables", "", "Comma-separated list of table names to include (case-insensitive)")
	splitCmd.Flags().StringVar(&splitExcludeSchema, "exclude-schema", "", "Comma-separated list of schema names to exclude")
	splitCmd.Flags().StringVar(&splitExcludeTables, "exclude-tables", "", "Comma-separated list of table names to exclude (case-insensitive)")

	err := splitCmd.MarkFlagRequired("from")
	if err != nil {
		fmt.Fprintf(os.Stderr, "Error marking from flag as required: %v\n", err)
	}
	err = splitCmd.MarkFlagRequired("to")
	if err != nil {
		fmt.Fprintf(os.Stderr, "Error marking to flag as required: %v\n", err)
	}
	err = splitCmd.MarkFlagRequired("to-path")
	if err != nil {
		fmt.Fprintf(os.Stderr, "Error marking to-path flag as required: %v\n", err)
	}
}

func runSplit(cmd *cobra.Command, args []string) error {
	fmt.Fprintf(os.Stderr, "\n=== RelSpec Schema Split ===\n")
	fmt.Fprintf(os.Stderr, "Started at: %s\n\n", getCurrentTimestamp())

	// Read source database
	fmt.Fprintf(os.Stderr, "[1/3] Reading source schema...\n")
	fmt.Fprintf(os.Stderr, "  Format: %s\n", splitSourceType)
	if splitSourcePath != "" {
		fmt.Fprintf(os.Stderr, "  Path: %s\n", splitSourcePath)
	}
	if splitSourceConn != "" {
		fmt.Fprintf(os.Stderr, "  Conn: %s\n", maskPassword(splitSourceConn))
	}

	db, err := readDatabaseForConvert(splitSourceType, splitSourcePath, splitSourceConn)
	if err != nil {
		return fmt.Errorf("failed to read source: %w", err)
	}

	fmt.Fprintf(os.Stderr, "  ✓ Successfully read database '%s'\n", db.Name)
	fmt.Fprintf(os.Stderr, "  Found: %d schema(s)\n", len(db.Schemas))

	totalTables := 0
	for _, schema := range db.Schemas {
		totalTables += len(schema.Tables)
	}
	fmt.Fprintf(os.Stderr, "  Found: %d table(s)\n\n", totalTables)

	// Filter the database
	fmt.Fprintf(os.Stderr, "[2/3] Filtering schemas and tables...\n")
	filteredDB, err := filterDatabase(db)
	if err != nil {
		return fmt.Errorf("failed to filter database: %w", err)
	}

	if splitDatabaseName != "" {
		filteredDB.Name = splitDatabaseName
	}

	filteredTables := 0
	for _, schema := range filteredDB.Schemas {
		filteredTables += len(schema.Tables)
	}
	fmt.Fprintf(os.Stderr, "  ✓ Filtered to: %d schema(s), %d table(s)\n\n", len(filteredDB.Schemas), filteredTables)

	// Write to target format
	fmt.Fprintf(os.Stderr, "[3/3] Writing to target format...\n")
	fmt.Fprintf(os.Stderr, "  Format: %s\n", splitTargetType)
	fmt.Fprintf(os.Stderr, "  Output: %s\n", splitTargetPath)
	if splitPackageName != "" {
		fmt.Fprintf(os.Stderr, "  Package: %s\n", splitPackageName)
	}

	err = writeDatabase(
		filteredDB,
		splitTargetType,
		splitTargetPath,
		splitPackageName,
		"", // no schema filter for split
	)
	if err != nil {
		return fmt.Errorf("failed to write output: %w", err)
	}

	fmt.Fprintf(os.Stderr, "  ✓ Successfully written to '%s'\n\n", splitTargetPath)
	fmt.Fprintf(os.Stderr, "=== Split Completed Successfully ===\n")
	fmt.Fprintf(os.Stderr, "Completed at: %s\n\n", getCurrentTimestamp())

	return nil
}

// filterDatabase filters the database based on provided criteria
func filterDatabase(db *models.Database) (*models.Database, error) {
	filteredDB := &models.Database{
		Name:            db.Name,
		Description:     db.Description,
		Comment:         db.Comment,
		DatabaseType:    db.DatabaseType,
		DatabaseVersion: db.DatabaseVersion,
		SourceFormat:    db.SourceFormat,
		UpdatedAt:       db.UpdatedAt,
		GUID:            db.GUID,
		Schemas:         []*models.Schema{},
		Domains:         db.Domains, // Keep domains for now
	}

	// Parse filter flags
	includeSchemas := parseCommaSeparated(splitSchemas)
	includeTables := parseCommaSeparated(splitTables)
	excludeSchemas := parseCommaSeparated(splitExcludeSchema)
	excludeTables := parseCommaSeparated(splitExcludeTables)

	// Convert table names to lowercase for case-insensitive matching
	includeTablesLower := make(map[string]bool)
	for _, t := range includeTables {
		includeTablesLower[strings.ToLower(t)] = true
	}

	excludeTablesLower := make(map[string]bool)
	for _, t := range excludeTables {
		excludeTablesLower[strings.ToLower(t)] = true
	}

	// Iterate through schemas
	for _, schema := range db.Schemas {
		// Check if schema should be excluded
		if contains(excludeSchemas, schema.Name) {
			continue
		}

		// Check if schema should be included
		if len(includeSchemas) > 0 && !contains(includeSchemas, schema.Name) {
			continue
		}

		// Create a copy of the schema with filtered tables
		filteredSchema := &models.Schema{
			Name:        schema.Name,
			Description: schema.Description,
			Owner:       schema.Owner,
			Permissions: schema.Permissions,
			Comment:     schema.Comment,
			Metadata:    schema.Metadata,
			Scripts:     schema.Scripts,
			Sequence:    schema.Sequence,
			Relations:   schema.Relations,
			Enums:       schema.Enums,
			UpdatedAt:   schema.UpdatedAt,
			GUID:        schema.GUID,
			Tables:      []*models.Table{},
			Views:       schema.Views,
			Sequences:   schema.Sequences,
		}

		// Filter tables within the schema
		for _, table := range schema.Tables {
			tableLower := strings.ToLower(table.Name)

			// Check if table should be excluded
			if excludeTablesLower[tableLower] {
				continue
			}

			// If specific tables are requested, only include those
			if len(includeTablesLower) > 0 {
				if !includeTablesLower[tableLower] {
					continue
				}
			}
			filteredSchema.Tables = append(filteredSchema.Tables, table)
		}

		// Only add schema if it has tables (unless no table filter was specified)
		if len(filteredSchema.Tables) > 0 || (len(includeTablesLower) == 0 && len(excludeTablesLower) == 0) {
			filteredDB.Schemas = append(filteredDB.Schemas, filteredSchema)
		}
	}

	if len(filteredDB.Schemas) == 0 {
		return nil, fmt.Errorf("no schemas matched the filter criteria")
	}

	return filteredDB, nil
}

// parseCommaSeparated parses a comma-separated string into a slice, trimming whitespace
func parseCommaSeparated(s string) []string {
	if s == "" {
		return []string{}
	}

	parts := strings.Split(s, ",")
	result := make([]string, 0, len(parts))
	for _, p := range parts {
		trimmed := strings.TrimSpace(p)
		if trimmed != "" {
			result = append(result, trimmed)
		}
	}
	return result
}

// contains checks if a string is in a slice
func contains(slice []string, item string) bool {
	for _, s := range slice {
		if s == item {
			return true
		}
	}
	return false
}
docs/DOMAINS_DRAWDB.md (new file, 149 lines)
@@ -0,0 +1,149 @@
# Domains and DrawDB Areas Integration

## Overview

Domains provide a way to organize tables from potentially multiple schemas into logical business groupings. When working with DrawDB format, domains are automatically imported/exported as **Subject Areas** - a native DrawDB feature for visually grouping tables.

## How It Works

### Writing Domains to DrawDB (Export)

When you export a database with domains to DrawDB format:

1. **Schema Areas** are created automatically for each schema (existing behavior)
2. **Domain Areas** are created for each domain, calculated based on the positions of the tables they contain
3. The domain area bounds are automatically calculated to encompass all its tables with a small padding

```go
// Example: Creating a domain and exporting to DrawDB
db := models.InitDatabase("mydb")

// Create an "authentication" domain
authDomain := models.InitDomain("authentication")
authDomain.Tables = append(authDomain.Tables,
	models.InitDomainTable("users", "public"),
	models.InitDomainTable("roles", "public"),
	models.InitDomainTable("permissions", "public"),
)
db.Domains = append(db.Domains, authDomain)

// Create a "financial" domain spanning multiple schemas
finDomain := models.InitDomain("financial")
finDomain.Tables = append(finDomain.Tables,
	models.InitDomainTable("accounts", "public"),
	models.InitDomainTable("transactions", "public"),
	models.InitDomainTable("ledger", "finance"), // Different schema!
)
db.Domains = append(db.Domains, finDomain)

// Write to DrawDB - domains become subject areas
writer := drawdb.NewWriter(&writers.WriterOptions{
	OutputPath: "schema.json",
})
writer.WriteDatabase(db)
```

The resulting DrawDB JSON will have Subject Areas for both:
- "authentication" area containing the auth tables
- "financial" area containing the financial tables from both schemas

### Reading Domains from DrawDB (Import)

When you import a DrawDB file with Subject Areas:

1. **Subject Areas** are automatically converted to **Domains**
2. Tables are assigned to a domain if they fall within the area's visual bounds
3. Table references include both the table name and schema name

```go
// Example: Reading DrawDB with areas
reader := drawdb.NewReader(&readers.ReaderOptions{
	FilePath: "schema.json",
})

db, err := reader.ReadDatabase()
if err != nil {
	log.Fatal(err)
}

// Access domains
for _, domain := range db.Domains {
	fmt.Printf("Domain: %s\n", domain.Name)
	for _, domainTable := range domain.Tables {
		fmt.Printf("  - %s.%s\n", domainTable.SchemaName, domainTable.TableName)

		// Access the actual table reference if loaded
		if domainTable.RefTable != nil {
			fmt.Printf("    Description: %s\n", domainTable.RefTable.Description)
		}
	}
}
```
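The area-to-table assignment in step 2 comes down to rectangle containment. Below is a minimal, self-contained sketch of that idea; the `Rect` type and its field names are illustrative stand-ins, not the actual DrawDB JSON schema or RelSpec's internal geometry types.

```go
package main

import "fmt"

// Rect is an illustrative stand-in for a subject area's bounding box.
type Rect struct{ X, Y, W, H float64 }

// containsPoint reports whether (x, y) lies inside the rectangle,
// treating the edges as inclusive.
func containsPoint(a Rect, x, y float64) bool {
	return x >= a.X && x <= a.X+a.W && y >= a.Y && y <= a.Y+a.H
}

func main() {
	area := Rect{X: 0, Y: 0, W: 400, H: 300}
	fmt.Println(containsPoint(area, 120, 80)) // table position inside the area
	fmt.Println(containsPoint(area, 500, 80)) // table position outside the area
}
```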

## Domain Structure

```go
type Domain struct {
	Name        string         // Domain name (e.g., "authentication", "user_data")
	Description string         // Optional human-readable description
	Tables      []*DomainTable // Tables belonging to this domain
	Comment     string         // Optional comment
	Metadata    map[string]any // Extensible metadata
	Sequence    uint           // Ordering hint
}

type DomainTable struct {
	TableName  string // Table name
	SchemaName string // Schema containing the table
	Sequence   uint   // Ordering hint
	RefTable   *Table // Pointer to actual table (in-memory only, not serialized)
}
```

## Multi-Schema Domains

One of the key features of domains is that they can span multiple schemas:

```
Domain: "user_data"
├── public.users
├── public.profiles
├── public.user_preferences
├── auth.user_sessions
└── auth.mfa_devices
```

This allows you to organize related tables even when they're stored in different schemas.
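As a self-contained sketch of the `user_data` grouping above, the following program collects the distinct schemas a domain spans. The `Domain`/`DomainTable` types here are stand-ins that mirror the documented fields; the real types live in the `models` package.

```go
package main

import "fmt"

// Stand-in types mirroring the documented Domain/DomainTable fields.
type DomainTable struct {
	TableName  string
	SchemaName string
}

type Domain struct {
	Name   string
	Tables []*DomainTable
}

// schemasOf returns the distinct schemas a domain spans, in first-seen order.
func schemasOf(d *Domain) []string {
	seen := map[string]bool{}
	var out []string
	for _, t := range d.Tables {
		if !seen[t.SchemaName] {
			seen[t.SchemaName] = true
			out = append(out, t.SchemaName)
		}
	}
	return out
}

func main() {
	userData := &Domain{
		Name: "user_data",
		Tables: []*DomainTable{
			{TableName: "users", SchemaName: "public"},
			{TableName: "profiles", SchemaName: "public"},
			{TableName: "user_sessions", SchemaName: "auth"},
			{TableName: "mfa_devices", SchemaName: "auth"},
		},
	}
	fmt.Printf("%s spans schemas: %v\n", userData.Name, schemasOf(userData))
	// → user_data spans schemas: [public auth]
}
```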

## Visual Organization in DrawDB

When viewing the exported DrawDB file in DrawDB Editor:

1. **Schema areas** appear in one color (original behavior)
2. **Domain areas** appear in a different color
3. Domain area bounds are calculated to fit all contained tables
4. Areas can overlap - a table can visually belong to multiple areas

## Integration with Other Formats

Currently, domain/area integration is implemented for DrawDB format.

To implement similar functionality for other formats:

1. Identify if the format has a native grouping/area feature
2. Add conversion logic in the reader to map format areas → Domain model
3. Add conversion logic in the writer to map Domain model → format areas

Example formats that could support domains:
- **DBML**: Could use DBML's `TableGroup` feature
- **DrawDB**: ✅ Already implemented (Subject Areas)
- **GraphQL**: Could use schema directives
- **Custom formats**: Implement as needed

## Tips and Best Practices

1. **Keep domains focused**: Each domain should represent a distinct business area
2. **Document purposes**: Use Description and Comment fields to explain each domain
3. **Use meaningful names**: Domain names should clearly reflect their purpose
4. **Maintain schema consistency**: Keep related tables together in the same schema when possible
5. **Use metadata**: Store tool-specific information in the Metadata field
@@ -44,12 +44,13 @@ The `--mode` flag controls how the template is executed:
|------|-------------|--------|-------------|
| `database` | Execute once for entire database | Single file | Documentation, reports, overview files |
| `schema` | Execute once per schema | One file per schema | Schema-specific documentation |
| `domain` | Execute once per domain | One file per domain | Domain-based documentation, domain exports |
| `script` | Execute once per script | One file per script | Script processing |
| `table` | Execute once per table | One file per table | Model generation, table docs |

### Filename Patterns

For multi-file modes (`schema`, `script`, `table`), use `--filename-pattern` to control output filenames:
For multi-file modes (`schema`, `domain`, `script`, `table`), use `--filename-pattern` to control output filenames:

```bash
# Default pattern
@@ -296,6 +297,13 @@ The data available in templates depends on the execution mode:
.Metadata        // map[string]interface{} - User metadata
```

### Domain Mode
```go
.Domain          // *models.Domain - Current domain
.ParentDatabase  // *models.Database - Parent database context
.Metadata        // map[string]interface{} - User metadata
```

### Table Mode
```go
.Table           // *models.Table - Current table
@@ -317,6 +325,7 @@ The data available in templates depends on the execution mode:
**Database:**
- `.Name` - Database name
- `.Schemas` - List of schemas
- `.Domains` - List of domains (business domain groupings)
- `.Description`, `.Comment` - Documentation

**Schema:**
@@ -325,6 +334,17 @@ The data available in templates depends on the execution mode:
- `.Views`, `.Sequences`, `.Scripts` - Other objects
- `.Enums` - Enum types

**Domain:**
- `.Name` - Domain name
- `.Tables` - List of DomainTable references
- `.Description`, `.Comment` - Documentation
- `.Metadata` - Custom metadata map

**DomainTable:**
- `.TableName` - Name of the table
- `.SchemaName` - Schema containing the table
- `.RefTable` - Pointer to actual Table object (if loaded)

**Table:**
- `.Name` - Table name
- `.Schema` - Schema name
go.mod (7 lines changed)
@@ -3,8 +3,10 @@ module git.warky.dev/wdevs/relspecgo
go 1.24.0

require (
	github.com/gdamore/tcell/v2 v2.8.1
	github.com/google/uuid v1.6.0
	github.com/jackc/pgx/v5 v5.7.6
	github.com/rivo/tview v0.42.0
	github.com/spf13/cobra v1.10.2
	github.com/stretchr/testify v1.11.1
	github.com/uptrace/bun v1.2.16
@@ -14,13 +16,17 @@ require (

require (
	github.com/davecgh/go-spew v1.1.1 // indirect
	github.com/gdamore/encoding v1.0.1 // indirect
	github.com/inconshreveable/mousetrap v1.1.0 // indirect
	github.com/jackc/pgpassfile v1.0.0 // indirect
	github.com/jackc/pgservicefile v0.0.0-20240606120523-5a60cdf6a761 // indirect
	github.com/jinzhu/inflection v1.0.0 // indirect
	github.com/kr/pretty v0.3.1 // indirect
	github.com/lucasb-eyer/go-colorful v1.2.0 // indirect
	github.com/mattn/go-runewidth v0.0.16 // indirect
	github.com/pmezard/go-difflib v1.0.0 // indirect
	github.com/puzpuzpuz/xsync/v3 v3.5.1 // indirect
	github.com/rivo/uniseg v0.4.7 // indirect
	github.com/rogpeppe/go-internal v1.14.1 // indirect
	github.com/spf13/pflag v1.0.10 // indirect
	github.com/tmthrgd/go-hex v0.0.0-20190904060850-447a3041c3bc // indirect
@@ -28,4 +34,5 @@ require (
	github.com/vmihailenco/tagparser/v2 v2.0.0 // indirect
	golang.org/x/crypto v0.41.0 // indirect
	golang.org/x/sys v0.38.0 // indirect
	golang.org/x/term v0.34.0 // indirect
)

go.sum (79 lines changed)
@@ -3,6 +3,11 @@ github.com/creack/pty v1.1.9/go.mod h1:oKZEueFk5CKHvIhNR5MUki03XCEU+Q6VDXinZuGJ3
|
||||
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
|
||||
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
|
||||
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
|
||||
github.com/gdamore/encoding v1.0.1 h1:YzKZckdBL6jVt2Gc+5p82qhrGiqMdG/eNs6Wy0u3Uhw=
|
||||
github.com/gdamore/encoding v1.0.1/go.mod h1:0Z0cMFinngz9kS1QfMjCP8TY7em3bZYeeklsSDPivEo=
|
||||
github.com/gdamore/tcell/v2 v2.8.1 h1:KPNxyqclpWpWQlPLx6Xui1pMk8S+7+R37h3g07997NU=
|
||||
github.com/gdamore/tcell/v2 v2.8.1/go.mod h1:bj8ori1BG3OYMjmb3IklZVWfZUJ1UBQt9JXrOCOhGWw=
|
||||
github.com/google/go-cmp v0.6.0/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
|
||||
github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
|
||||
github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
|
||||
github.com/inconshreveable/mousetrap v1.1.0 h1:wN+x4NVGpMsO7ErUn/mUI3vEoE6Jt13X2s0bqwp9tc8=
|
||||
@@ -21,11 +26,21 @@ github.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=
|
||||
github.com/kr/pretty v0.3.1/go.mod h1:hoEshYVHaxMs3cyo3Yncou5ZscifuDolrwPKZanG3xk=
|
||||
github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
|
||||
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
|
||||
github.com/lucasb-eyer/go-colorful v1.2.0 h1:1nnpGOrhyZZuNyfu1QjKiUICQ74+3FNCN69Aj6K7nkY=
|
||||
github.com/lucasb-eyer/go-colorful v1.2.0/go.mod h1:R4dSotOR9KMtayYi1e77YzuveK+i7ruzyGqttikkLy0=
|
||||
github.com/mattn/go-runewidth v0.0.16 h1:E5ScNMtiwvlvB5paMFdw9p4kSQzbXFikJ5SQO6TULQc=
|
||||
github.com/mattn/go-runewidth v0.0.16/go.mod h1:Jdepj2loyihRzMpdS35Xk/zdY8IAYHsh153qUoGf23w=
|
||||
github.com/pkg/diff v0.0.0-20210226163009-20ebb0f2a09e/go.mod h1:pJLUxLENpZxwdsKMEsNbx1VGcRFpLqf3715MtcvvzbA=
|
||||
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
|
||||
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
|
||||
github.com/puzpuzpuz/xsync/v3 v3.5.1 h1:GJYJZwO6IdxN/IKbneznS6yPkVC+c3zyY/j19c++5Fg=
|
||||
github.com/puzpuzpuz/xsync/v3 v3.5.1/go.mod h1:VjzYrABPabuM4KyBh1Ftq6u8nhwY5tBPKP9jpmh0nnA=
|
||||
github.com/rivo/tview v0.42.0 h1:b/ftp+RxtDsHSaynXTbJb+/n/BxDEi+W3UfF5jILK6c=
|
||||
github.com/rivo/tview v0.42.0/go.mod h1:cSfIYfhpSGCjp3r/ECJb+GKS7cGJnqV8vfjQPwoXyfY=
|
||||
github.com/rivo/uniseg v0.2.0/go.mod h1:J6wj4VEh+S6ZtnVlnTBMWIodfgj8LQOQFoIToxlJtxc=
|
||||
github.com/rivo/uniseg v0.4.3/go.mod h1:FN3SvrM+Zdj16jyLfmOkMNblXMcoc8DfTHruCPUcx88=
|
||||
github.com/rivo/uniseg v0.4.7 h1:WUdvkW8uEhrYfLC4ZzdpI2ztxP1I582+49Oc5Mq64VQ=
|
||||
github.com/rivo/uniseg v0.4.7/go.mod h1:FN3SvrM+Zdj16jyLfmOkMNblXMcoc8DfTHruCPUcx88=
|
||||
github.com/rogpeppe/go-internal v1.9.0/go.mod h1:WtVeX8xhTBvf0smdhujwtBcq4Qrzq/fJaraNFVN+nFs=
|
||||
github.com/rogpeppe/go-internal v1.14.1 h1:UQB4HGPB6osV0SQTLymcB4TgvyWu6ZyliaW0tI/otEQ=
|
||||
github.com/rogpeppe/go-internal v1.14.1/go.mod h1:MaRKkUm5W0goXpeCfT7UZI6fk/L7L7so1lCWt35ZSgc=
|
||||
@@ -48,15 +63,79 @@ github.com/vmihailenco/msgpack/v5 v5.4.1 h1:cQriyiUvjTwOHg8QZaPihLWeRAAVoCpE00IU
github.com/vmihailenco/msgpack/v5 v5.4.1/go.mod h1:GaZTsDaehaPpQVyxrf5mtQlH+pc21PIudVV/E3rRQok=
github.com/vmihailenco/tagparser/v2 v2.0.0 h1:y09buUbR+b5aycVFQs/g70pqKVZNBmxwAhO7/IwNM9g=
github.com/vmihailenco/tagparser/v2 v2.0.0/go.mod h1:Wri+At7QHww0WTrCBeu4J6bNtoV6mEfg5OIWRZA9qds=
github.com/yuin/goldmark v1.4.13/go.mod h1:6yULJ656Px+3vBD8DxQVa3kxgyrAnzto9xy5taEt/CY=
go.yaml.in/yaml/v3 v3.0.4/go.mod h1:DhzuOOF2ATzADvBadXxruRBLzYTpT36CKvDb3+aBEFg=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
golang.org/x/crypto v0.13.0/go.mod h1:y6Z2r+Rw4iayiXXAIxJIDAJ1zMW4yaTpebo8fPOliYc=
golang.org/x/crypto v0.19.0/go.mod h1:Iy9bg/ha4yyC70EfRS8jz+B6ybOBKMaSxLj6P6oBDfU=
golang.org/x/crypto v0.23.0/go.mod h1:CKFgDieR+mRhux2Lsu27y0fO304Db0wZe70UKqHu0v8=
golang.org/x/crypto v0.41.0 h1:WKYxWedPGCTVVl5+WHSSrOBT0O8lx32+zxmHxijgXp4=
golang.org/x/crypto v0.41.0/go.mod h1:pO5AFd7FA68rFak7rOAGVuygIISepHftHnr8dr6+sUc=
golang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=
golang.org/x/mod v0.8.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
golang.org/x/mod v0.12.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
golang.org/x/mod v0.15.0/go.mod h1:hTbmBsO62+eylJbnUtE2MGJUyE7QWk4xUqPFrRgJ+7c=
golang.org/x/mod v0.17.0/go.mod h1:hTbmBsO62+eylJbnUtE2MGJUyE7QWk4xUqPFrRgJ+7c=
golang.org/x/net v0.0.0-20190620200207-3b0461eec859/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=
golang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v0D8zg8gWTRqZa9RBIspLL5mdg=
golang.org/x/net v0.0.0-20220722155237-a158d28d115b/go.mod h1:XRhObCWvk6IyKnWLug+ECip1KBveYUHfp+8e9klMJ9c=
golang.org/x/net v0.6.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=
golang.org/x/net v0.10.0/go.mod h1:0qNGK6F8kojg2nk9dLZ2mShWaEBan6FAoqfSigmmuDg=
golang.org/x/net v0.15.0/go.mod h1:idbUs1IY1+zTqbi8yxTbhexhEEk5ur9LInksu6HrEpk=
golang.org/x/net v0.21.0/go.mod h1:bIjVDfnllIU7BJ2DNgfnXvpSvtn8VRwhlsaeUTyUS44=
golang.org/x/net v0.25.0/go.mod h1:JkAGAh7GEvH74S6FOH42FLoXpXbE/aqXSrIQjXgsiwM=
golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.1.0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.3.0/go.mod h1:FU7BRWz2tNW+3quACPkgCx/L+uEAv1htQ0V83Z9Rj+Y=
golang.org/x/sync v0.6.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/sync v0.7.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/sync v0.10.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/sync v0.16.0 h1:ycBJEhp9p4vXvUZNszeOq0kGTPghopOL8q0fq3vstxw=
golang.org/x/sync v0.16.0/go.mod h1:1dzgHSNfp02xaA81J2MS99Qcpr2w7fw1gpm99rleRqA=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220520151302-bc2c85ada10a/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220722155257-8c9f86f7a55f/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.8.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.12.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.17.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.20.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.29.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.38.0 h1:3yZWxaJjBmCWXqhN1qh02AkOnCQ1poK6oF+a7xWL6Gc=
golang.org/x/sys v0.38.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/telemetry v0.0.0-20240228155512-f48c80bd79b2/go.mod h1:TeRTkGYfJXctD9OcfyVLyj2J3IxLnKwHJR8f4D8a3YE=
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
golang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=
golang.org/x/term v0.5.0/go.mod h1:jMB1sMXY+tzblOD4FWmEbocvup2/aLOaQEp7JmGp78k=
golang.org/x/term v0.8.0/go.mod h1:xPskH00ivmX89bAKVGSKKtLOWNx2+17Eiy94tnKShWo=
golang.org/x/term v0.12.0/go.mod h1:owVbMEjm3cBLCHdkQu9b1opXd4ETQWc3BhuQGKgXgvU=
golang.org/x/term v0.17.0/go.mod h1:lLRBjIVuehSbZlaOtGMbcMncT+aqLLLmKrsjNrUguwk=
golang.org/x/term v0.20.0/go.mod h1:8UkIAJTvZgivsXaD6/pH6U9ecQzZ45awqEOzuCvwpFY=
golang.org/x/term v0.28.0/go.mod h1:Sw/lC2IAUZ92udQNf3WodGtn4k/XoLyZoh8v/8uiwek=
golang.org/x/term v0.34.0 h1:O/2T7POpk0ZZ7MAzMeWFSg6S5IpWd/RXDlM9hgM3DR4=
golang.org/x/term v0.34.0/go.mod h1:5jC53AEywhIVebHgPVeg0mj8OD3VO9OzclacVrqpaAw=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=
golang.org/x/text v0.7.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=
golang.org/x/text v0.9.0/go.mod h1:e1OnstbJyHTd6l/uOt8jFFHp6TRDWZR/bV3emEE/zU8=
golang.org/x/text v0.13.0/go.mod h1:TvPlkZtksWOMsz7fbANvkp4WM8x/WCo/om8BMLbz+aE=
golang.org/x/text v0.14.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU=
golang.org/x/text v0.15.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU=
golang.org/x/text v0.21.0/go.mod h1:4IBbMaMmOPCJ8SecivzSH54+73PCFmPWxNTLm+vZkEQ=
golang.org/x/text v0.28.0 h1:rhazDwis8INMIwQ4tpjLDzUhx6RlXqZNPEM0huQojng=
golang.org/x/text v0.28.0/go.mod h1:U8nCwOR8jO/marOQ0QbDiOngZVEBB7MAiitBuMjXiNU=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=
golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=
golang.org/x/tools v0.6.0/go.mod h1:Xwgl3UAJ/d3gWutnCtw505GrjyAbvKui8lOU390QaIU=
golang.org/x/tools v0.13.0/go.mod h1:HvlwmtVNQAhOuCjW7xxvovg8wbNq7LwfXh/k7wXUl58=
golang.org/x/tools v0.21.1-0.20240508182429-e35e4ccd0d2d/go.mod h1:aiJjzUbINMkxbQROHiO6hDPo2LHcIPhhQsa9DLh0yGk=
golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c h1:Hei/4ADfdWqJk1ZMxUNpqntNwaWcugrBjAiHlqqRiVk=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c/go.mod h1:JHkPIbrfpd72SG/EVd6muEfDQjcINNoR0C8j2r3qZ4Q=
574
pkg/merge/merge.go
Normal file
@@ -0,0 +1,574 @@
// Package merge provides utilities for merging database schemas.
// It allows combining schemas from multiple sources while avoiding duplicates,
// supporting only additive operations (no deletion or modification of existing items).
package merge

import (
	"fmt"
	"strings"

	"git.warky.dev/wdevs/relspecgo/pkg/models"
)

// MergeResult represents the result of a merge operation
type MergeResult struct {
	SchemasAdded   int
	TablesAdded    int
	ColumnsAdded   int
	RelationsAdded int
	DomainsAdded   int
	EnumsAdded     int
	ViewsAdded     int
	SequencesAdded int
}

// MergeOptions contains options for merge operations
type MergeOptions struct {
	SkipDomains    bool
	SkipRelations  bool
	SkipEnums      bool
	SkipViews      bool
	SkipSequences  bool
	SkipTableNames map[string]bool // Tables to skip during merge (keyed by table name)
}

// MergeDatabases merges the source database into the target database.
// Only adds missing items; existing items are not modified.
func MergeDatabases(target, source *models.Database, opts *MergeOptions) *MergeResult {
	if opts == nil {
		opts = &MergeOptions{}
	}

	result := &MergeResult{}

	if target == nil || source == nil {
		return result
	}

	// Merge schemas and their contents
	result.merge(target, source, opts)

	return result
}
func (r *MergeResult) merge(target, source *models.Database, opts *MergeOptions) {
	// Create maps of existing schemas for quick lookup
	existingSchemas := make(map[string]*models.Schema)
	for _, schema := range target.Schemas {
		existingSchemas[schema.SQLName()] = schema
	}

	// Merge schemas
	for _, srcSchema := range source.Schemas {
		schemaName := srcSchema.SQLName()
		if tgtSchema, exists := existingSchemas[schemaName]; exists {
			// Schema exists, merge its contents
			r.mergeSchemaContents(tgtSchema, srcSchema, opts)
		} else {
			// Schema doesn't exist, add it
			newSchema := cloneSchema(srcSchema)
			target.Schemas = append(target.Schemas, newSchema)
			r.SchemasAdded++
		}
	}

	// Merge domains if not skipped
	if !opts.SkipDomains {
		r.mergeDomains(target, source)
	}
}

func (r *MergeResult) mergeSchemaContents(target, source *models.Schema, opts *MergeOptions) {
	// Merge tables
	r.mergeTables(target, source, opts)

	// Merge views if not skipped
	if !opts.SkipViews {
		r.mergeViews(target, source)
	}

	// Merge sequences if not skipped
	if !opts.SkipSequences {
		r.mergeSequences(target, source)
	}

	// Merge enums if not skipped
	if !opts.SkipEnums {
		r.mergeEnums(target, source)
	}

	// Merge relations if not skipped
	if !opts.SkipRelations {
		r.mergeRelations(target, source)
	}
}
func (r *MergeResult) mergeTables(schema *models.Schema, source *models.Schema, opts *MergeOptions) {
	// Create map of existing tables
	existingTables := make(map[string]*models.Table)
	for _, table := range schema.Tables {
		existingTables[table.SQLName()] = table
	}

	// Merge tables
	for _, srcTable := range source.Tables {
		tableName := srcTable.SQLName()

		// Skip if table is in the skip list (case-insensitive)
		if opts != nil && opts.SkipTableNames != nil && opts.SkipTableNames[strings.ToLower(tableName)] {
			continue
		}

		if tgtTable, exists := existingTables[tableName]; exists {
			// Table exists, merge its columns
			r.mergeColumns(tgtTable, srcTable)
		} else {
			// Table doesn't exist, add it
			newTable := cloneTable(srcTable)
			schema.Tables = append(schema.Tables, newTable)
			r.TablesAdded++
			// Count columns in the newly added table
			r.ColumnsAdded += len(newTable.Columns)
		}
	}
}

func (r *MergeResult) mergeColumns(table *models.Table, srcTable *models.Table) {
	// Create map of existing columns
	existingColumns := make(map[string]*models.Column)
	for colName := range table.Columns {
		existingColumns[colName] = table.Columns[colName]
	}

	// Merge columns
	for colName, srcCol := range srcTable.Columns {
		if _, exists := existingColumns[colName]; !exists {
			// Column doesn't exist, add it
			newCol := cloneColumn(srcCol)
			table.Columns[colName] = newCol
			r.ColumnsAdded++
		}
	}
}
func (r *MergeResult) mergeViews(schema *models.Schema, source *models.Schema) {
	// Create map of existing views
	existingViews := make(map[string]*models.View)
	for _, view := range schema.Views {
		existingViews[view.SQLName()] = view
	}

	// Merge views
	for _, srcView := range source.Views {
		viewName := srcView.SQLName()
		if _, exists := existingViews[viewName]; !exists {
			// View doesn't exist, add it
			newView := cloneView(srcView)
			schema.Views = append(schema.Views, newView)
			r.ViewsAdded++
		}
	}
}

func (r *MergeResult) mergeSequences(schema *models.Schema, source *models.Schema) {
	// Create map of existing sequences
	existingSequences := make(map[string]*models.Sequence)
	for _, seq := range schema.Sequences {
		existingSequences[seq.SQLName()] = seq
	}

	// Merge sequences
	for _, srcSeq := range source.Sequences {
		seqName := srcSeq.SQLName()
		if _, exists := existingSequences[seqName]; !exists {
			// Sequence doesn't exist, add it
			newSeq := cloneSequence(srcSeq)
			schema.Sequences = append(schema.Sequences, newSeq)
			r.SequencesAdded++
		}
	}
}

func (r *MergeResult) mergeEnums(schema *models.Schema, source *models.Schema) {
	// Create map of existing enums
	existingEnums := make(map[string]*models.Enum)
	for _, enum := range schema.Enums {
		existingEnums[enum.SQLName()] = enum
	}

	// Merge enums
	for _, srcEnum := range source.Enums {
		enumName := srcEnum.SQLName()
		if _, exists := existingEnums[enumName]; !exists {
			// Enum doesn't exist, add it
			newEnum := cloneEnum(srcEnum)
			schema.Enums = append(schema.Enums, newEnum)
			r.EnumsAdded++
		}
	}
}

func (r *MergeResult) mergeRelations(schema *models.Schema, source *models.Schema) {
	// Create map of existing relations
	existingRelations := make(map[string]*models.Relationship)
	for _, rel := range schema.Relations {
		existingRelations[rel.SQLName()] = rel
	}

	// Merge relations
	for _, srcRel := range source.Relations {
		if _, exists := existingRelations[srcRel.SQLName()]; !exists {
			// Relation doesn't exist, add it
			newRel := cloneRelation(srcRel)
			schema.Relations = append(schema.Relations, newRel)
			r.RelationsAdded++
		}
	}
}
func (r *MergeResult) mergeDomains(target *models.Database, source *models.Database) {
	// Create map of existing domains
	existingDomains := make(map[string]*models.Domain)
	for _, domain := range target.Domains {
		existingDomains[domain.SQLName()] = domain
	}

	// Merge domains
	for _, srcDomain := range source.Domains {
		domainName := srcDomain.SQLName()
		if _, exists := existingDomains[domainName]; !exists {
			// Domain doesn't exist, add it
			newDomain := cloneDomain(srcDomain)
			target.Domains = append(target.Domains, newDomain)
			r.DomainsAdded++
		}
	}
}

// Clone functions to create deep copies of models
func cloneSchema(schema *models.Schema) *models.Schema {
	if schema == nil {
		return nil
	}
	newSchema := &models.Schema{
		Name:        schema.Name,
		Description: schema.Description,
		Owner:       schema.Owner,
		Comment:     schema.Comment,
		Sequence:    schema.Sequence,
		UpdatedAt:   schema.UpdatedAt,
		Tables:      make([]*models.Table, 0),
		Views:       make([]*models.View, 0),
		Sequences:   make([]*models.Sequence, 0),
		Enums:       make([]*models.Enum, 0),
		Relations:   make([]*models.Relationship, 0),
	}

	if schema.Permissions != nil {
		newSchema.Permissions = make(map[string]string)
		for k, v := range schema.Permissions {
			newSchema.Permissions[k] = v
		}
	}

	if schema.Metadata != nil {
		newSchema.Metadata = make(map[string]interface{})
		for k, v := range schema.Metadata {
			newSchema.Metadata[k] = v
		}
	}

	if schema.Scripts != nil {
		newSchema.Scripts = make([]*models.Script, len(schema.Scripts))
		copy(newSchema.Scripts, schema.Scripts)
	}

	// Clone tables
	for _, table := range schema.Tables {
		newSchema.Tables = append(newSchema.Tables, cloneTable(table))
	}

	// Clone views
	for _, view := range schema.Views {
		newSchema.Views = append(newSchema.Views, cloneView(view))
	}

	// Clone sequences
	for _, seq := range schema.Sequences {
		newSchema.Sequences = append(newSchema.Sequences, cloneSequence(seq))
	}

	// Clone enums
	for _, enum := range schema.Enums {
		newSchema.Enums = append(newSchema.Enums, cloneEnum(enum))
	}

	// Clone relations
	for _, rel := range schema.Relations {
		newSchema.Relations = append(newSchema.Relations, cloneRelation(rel))
	}

	return newSchema
}
func cloneTable(table *models.Table) *models.Table {
	if table == nil {
		return nil
	}
	newTable := &models.Table{
		Name:        table.Name,
		Description: table.Description,
		Schema:      table.Schema,
		Comment:     table.Comment,
		Sequence:    table.Sequence,
		UpdatedAt:   table.UpdatedAt,
		Columns:     make(map[string]*models.Column),
		Constraints: make(map[string]*models.Constraint),
		Indexes:     make(map[string]*models.Index),
	}

	if table.Metadata != nil {
		newTable.Metadata = make(map[string]interface{})
		for k, v := range table.Metadata {
			newTable.Metadata[k] = v
		}
	}

	// Clone columns
	for colName, col := range table.Columns {
		newTable.Columns[colName] = cloneColumn(col)
	}

	// Clone constraints
	for constName, constraint := range table.Constraints {
		newTable.Constraints[constName] = cloneConstraint(constraint)
	}

	// Clone indexes
	for idxName, index := range table.Indexes {
		newTable.Indexes[idxName] = cloneIndex(index)
	}

	return newTable
}
func cloneColumn(col *models.Column) *models.Column {
	if col == nil {
		return nil
	}
	newCol := &models.Column{
		Name:          col.Name,
		Type:          col.Type,
		Description:   col.Description,
		Comment:       col.Comment,
		IsPrimaryKey:  col.IsPrimaryKey,
		NotNull:       col.NotNull,
		Default:       col.Default,
		Precision:     col.Precision,
		Scale:         col.Scale,
		Length:        col.Length,
		Sequence:      col.Sequence,
		AutoIncrement: col.AutoIncrement,
		Collation:     col.Collation,
	}

	return newCol
}

func cloneConstraint(constraint *models.Constraint) *models.Constraint {
	if constraint == nil {
		return nil
	}
	newConstraint := &models.Constraint{
		Type:              constraint.Type,
		Columns:           make([]string, len(constraint.Columns)),
		ReferencedTable:   constraint.ReferencedTable,
		ReferencedSchema:  constraint.ReferencedSchema,
		ReferencedColumns: make([]string, len(constraint.ReferencedColumns)),
		OnUpdate:          constraint.OnUpdate,
		OnDelete:          constraint.OnDelete,
		Expression:        constraint.Expression,
		Name:              constraint.Name,
		Deferrable:        constraint.Deferrable,
		InitiallyDeferred: constraint.InitiallyDeferred,
		Sequence:          constraint.Sequence,
	}
	copy(newConstraint.Columns, constraint.Columns)
	copy(newConstraint.ReferencedColumns, constraint.ReferencedColumns)
	return newConstraint
}
func cloneIndex(index *models.Index) *models.Index {
	if index == nil {
		return nil
	}
	newIndex := &models.Index{
		Name:        index.Name,
		Description: index.Description,
		Table:       index.Table,
		Schema:      index.Schema,
		Columns:     make([]string, len(index.Columns)),
		Unique:      index.Unique,
		Type:        index.Type,
		Where:       index.Where,
		Concurrent:  index.Concurrent,
		Include:     make([]string, len(index.Include)),
		Comment:     index.Comment,
		Sequence:    index.Sequence,
	}
	copy(newIndex.Columns, index.Columns)
	copy(newIndex.Include, index.Include)
	return newIndex
}

func cloneView(view *models.View) *models.View {
	if view == nil {
		return nil
	}
	newView := &models.View{
		Name:        view.Name,
		Description: view.Description,
		Schema:      view.Schema,
		Definition:  view.Definition,
		Comment:     view.Comment,
		Sequence:    view.Sequence,
		Columns:     make(map[string]*models.Column),
	}

	if view.Metadata != nil {
		newView.Metadata = make(map[string]interface{})
		for k, v := range view.Metadata {
			newView.Metadata[k] = v
		}
	}

	// Clone columns
	for colName, col := range view.Columns {
		newView.Columns[colName] = cloneColumn(col)
	}

	return newView
}
func cloneSequence(seq *models.Sequence) *models.Sequence {
	if seq == nil {
		return nil
	}
	newSeq := &models.Sequence{
		Name:          seq.Name,
		Description:   seq.Description,
		Schema:        seq.Schema,
		StartValue:    seq.StartValue,
		MinValue:      seq.MinValue,
		MaxValue:      seq.MaxValue,
		IncrementBy:   seq.IncrementBy,
		CacheSize:     seq.CacheSize,
		Cycle:         seq.Cycle,
		OwnedByTable:  seq.OwnedByTable,
		OwnedByColumn: seq.OwnedByColumn,
		Comment:       seq.Comment,
		Sequence:      seq.Sequence,
	}
	return newSeq
}

func cloneEnum(enum *models.Enum) *models.Enum {
	if enum == nil {
		return nil
	}
	newEnum := &models.Enum{
		Name:   enum.Name,
		Values: make([]string, len(enum.Values)),
		Schema: enum.Schema,
	}
	copy(newEnum.Values, enum.Values)
	return newEnum
}
func cloneRelation(rel *models.Relationship) *models.Relationship {
	if rel == nil {
		return nil
	}
	newRel := &models.Relationship{
		Name:          rel.Name,
		Type:          rel.Type,
		FromTable:     rel.FromTable,
		FromSchema:    rel.FromSchema,
		FromColumns:   make([]string, len(rel.FromColumns)),
		ToTable:       rel.ToTable,
		ToSchema:      rel.ToSchema,
		ToColumns:     make([]string, len(rel.ToColumns)),
		ForeignKey:    rel.ForeignKey,
		ThroughTable:  rel.ThroughTable,
		ThroughSchema: rel.ThroughSchema,
		Description:   rel.Description,
		Sequence:      rel.Sequence,
	}

	if rel.Properties != nil {
		newRel.Properties = make(map[string]string)
		for k, v := range rel.Properties {
			newRel.Properties[k] = v
		}
	}

	copy(newRel.FromColumns, rel.FromColumns)
	copy(newRel.ToColumns, rel.ToColumns)
	return newRel
}

func cloneDomain(domain *models.Domain) *models.Domain {
	if domain == nil {
		return nil
	}
	newDomain := &models.Domain{
		Name:        domain.Name,
		Description: domain.Description,
		Comment:     domain.Comment,
		Sequence:    domain.Sequence,
		Tables:      make([]*models.DomainTable, len(domain.Tables)),
	}

	if domain.Metadata != nil {
		newDomain.Metadata = make(map[string]interface{})
		for k, v := range domain.Metadata {
			newDomain.Metadata[k] = v
		}
	}

	copy(newDomain.Tables, domain.Tables)
	return newDomain
}
// GetMergeSummary returns a human-readable summary of the merge result
func GetMergeSummary(result *MergeResult) string {
	if result == nil {
		return "No merge result available"
	}

	lines := []string{
		"=== Merge Summary ===",
		fmt.Sprintf("Schemas added: %d", result.SchemasAdded),
		fmt.Sprintf("Tables added: %d", result.TablesAdded),
		fmt.Sprintf("Columns added: %d", result.ColumnsAdded),
		fmt.Sprintf("Views added: %d", result.ViewsAdded),
		fmt.Sprintf("Sequences added: %d", result.SequencesAdded),
		fmt.Sprintf("Enums added: %d", result.EnumsAdded),
		fmt.Sprintf("Relations added: %d", result.RelationsAdded),
		fmt.Sprintf("Domains added: %d", result.DomainsAdded),
	}

	totalAdded := result.SchemasAdded + result.TablesAdded + result.ColumnsAdded +
		result.ViewsAdded + result.SequencesAdded + result.EnumsAdded +
		result.RelationsAdded + result.DomainsAdded

	lines = append(lines, fmt.Sprintf("Total items added: %d", totalAdded))

	summary := ""
	for _, line := range lines {
		summary += line + "\n"
	}

	return summary
}
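Every `merge*` method in the new package follows the same additive, name-keyed pattern: build a lookup map of the target's existing items keyed by `SQLName()`, then append only the source items whose key is missing. The pattern can be sketched in isolation; this is a minimal stand-in, not the relspecgo API, and the `Table` type here is trimmed to a single `Name` field.

```go
package main

import (
	"fmt"
	"strings"
)

// Table is a stand-in for the richer models.Table type.
type Table struct{ Name string }

// SQLName mirrors the models convention of lowercasing names for lookups.
func (t *Table) SQLName() string { return strings.ToLower(t.Name) }

// mergeTables appends tables from src that are missing in dst (matched by
// SQLName) and returns the updated slice plus the number added.
// Existing tables are never modified or removed.
func mergeTables(dst, src []*Table) ([]*Table, int) {
	existing := make(map[string]bool, len(dst))
	for _, t := range dst {
		existing[t.SQLName()] = true
	}
	added := 0
	for _, t := range src {
		if !existing[t.SQLName()] {
			dst = append(dst, t)
			added++
		}
	}
	return dst, added
}

func main() {
	target := []*Table{{Name: "Users"}}
	source := []*Table{{Name: "users"}, {Name: "Orders"}}
	merged, added := mergeTables(target, source)
	fmt.Println(len(merged), added) // 2 1
}
```

Because matching is done on the lowercased name, `Users` and `users` count as the same table, which is also why `SkipTableNames` in `MergeOptions` is keyed by `strings.ToLower(tableName)`.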
@@ -4,7 +4,12 @@
// intermediate representation for converting between various database schema formats.
package models

import "strings"
import (
	"strings"
	"time"

	"github.com/google/uuid"
)

// DatabaseType represents the type of database system.
type DatabaseType string
@@ -21,10 +26,13 @@ type Database struct {
	Name string `json:"name" yaml:"name"`
	Description string `json:"description,omitempty" yaml:"description,omitempty" xml:"description,omitempty"`
	Schemas []*Schema `json:"schemas" yaml:"schemas" xml:"schemas"`
	Domains []*Domain `json:"domains,omitempty" yaml:"domains,omitempty" xml:"domains,omitempty"`
	Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
	DatabaseType DatabaseType `json:"database_type,omitempty" yaml:"database_type,omitempty" xml:"database_type,omitempty"`
	DatabaseVersion string `json:"database_version,omitempty" yaml:"database_version,omitempty" xml:"database_version,omitempty"`
	SourceFormat string `json:"source_format,omitempty" yaml:"source_format,omitempty" xml:"source_format,omitempty"` // Source Format of the database.
	UpdatedAt string `json:"updatedat,omitempty" yaml:"updatedat,omitempty" xml:"updatedat,omitempty"`
	GUID string `json:"guid" yaml:"guid" xml:"guid"`
}

// SQLName returns the database name in lowercase for SQL compatibility.
@@ -32,6 +40,39 @@ func (d *Database) SQLName() string {
	return strings.ToLower(d.Name)
}

// UpdateDate sets the UpdatedAt field to the current time in RFC3339 format.
func (d *Database) UpdateDate() {
	d.UpdatedAt = time.Now().Format(time.RFC3339)
}

// Domain represents a logical business domain grouping multiple tables from potentially different schemas.
// Domains allow for organizing database tables by functional areas (e.g., authentication, user data, financial).
type Domain struct {
	Name string `json:"name" yaml:"name" xml:"name"`
	Description string `json:"description,omitempty" yaml:"description,omitempty" xml:"description,omitempty"`
	Tables []*DomainTable `json:"tables" yaml:"tables" xml:"tables"`
	Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
	Metadata map[string]any `json:"metadata,omitempty" yaml:"metadata,omitempty" xml:"-"`
	Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
	GUID string `json:"guid" yaml:"guid" xml:"guid"`
}

// SQLName returns the domain name in lowercase for SQL compatibility.
func (d *Domain) SQLName() string {
	return strings.ToLower(d.Name)
}

// DomainTable represents a reference to a specific table within a domain.
// It identifies the table by name and schema, allowing a single domain to include
// tables from multiple schemas.
type DomainTable struct {
	TableName string `json:"table_name" yaml:"table_name" xml:"table_name"`
	SchemaName string `json:"schema_name" yaml:"schema_name" xml:"schema_name"`
	Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
	RefTable *Table `json:"-" yaml:"-" xml:"-"` // Excluded to prevent circular references
	GUID string `json:"guid" yaml:"guid" xml:"guid"`
}

// Schema represents a database schema, which is a logical grouping of database objects
// such as tables, views, sequences, and relationships within a database.
type Schema struct {
@@ -49,6 +90,16 @@ type Schema struct {
	RefDatabase *Database `json:"-" yaml:"-" xml:"-"` // Excluded to prevent circular references
	Relations []*Relationship `json:"relations,omitempty" yaml:"relations,omitempty" xml:"-"`
	Enums []*Enum `json:"enums,omitempty" yaml:"enums,omitempty" xml:"enums"`
	UpdatedAt string `json:"updatedat,omitempty" yaml:"updatedat,omitempty" xml:"updatedat,omitempty"`
	GUID string `json:"guid" yaml:"guid" xml:"guid"`
}

// UpdateDate sets the UpdatedAt field to the current time in RFC3339 format.
func (d *Schema) UpdateDate() {
	d.UpdatedAt = time.Now().Format(time.RFC3339)
	if d.RefDatabase != nil {
		d.RefDatabase.UpdateDate()
	}
}

// SQLName returns the schema name in lowercase for SQL compatibility.
@@ -71,6 +122,16 @@ type Table struct {
	Metadata map[string]any `json:"metadata,omitempty" yaml:"metadata,omitempty" xml:"-"`
	Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
	RefSchema *Schema `json:"-" yaml:"-" xml:"-"` // Excluded to prevent circular references
	UpdatedAt string `json:"updatedat,omitempty" yaml:"updatedat,omitempty" xml:"updatedat,omitempty"`
	GUID string `json:"guid" yaml:"guid" xml:"guid"`
}

// UpdateDate sets the UpdatedAt field to the current time in RFC3339 format.
func (d *Table) UpdateDate() {
	d.UpdatedAt = time.Now().Format(time.RFC3339)
	if d.RefSchema != nil {
		d.RefSchema.UpdateDate()
	}
}

// SQLName returns the table name in lowercase for SQL compatibility.
@@ -111,6 +172,7 @@ type View struct {
	Metadata map[string]any `json:"metadata,omitempty" yaml:"metadata,omitempty" xml:"-"`
	Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
	RefSchema *Schema `json:"-" yaml:"-" xml:"-"` // Excluded to prevent circular references
	GUID string `json:"guid" yaml:"guid" xml:"guid"`
}

// SQLName returns the view name in lowercase for SQL compatibility.
@@ -134,6 +196,7 @@ type Sequence struct {
	Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
	Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
	RefSchema *Schema `json:"-" yaml:"-" xml:"-"` // Excluded to prevent circular references
	GUID string `json:"guid" yaml:"guid" xml:"guid"`
}

// SQLName returns the sequence name in lowercase for SQL compatibility.
@@ -158,6 +221,7 @@ type Column struct {
	Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
	Collation string `json:"collation,omitempty" yaml:"collation,omitempty" xml:"collation,omitempty"`
	Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
	GUID string `json:"guid" yaml:"guid" xml:"guid"`
}

// SQLName returns the column name in lowercase for SQL compatibility.
@@ -180,6 +244,7 @@ type Index struct {
	Include []string `json:"include,omitempty" yaml:"include,omitempty" xml:"include,omitempty"` // INCLUDE columns
	Comment string `json:"comment,omitempty" yaml:"comment,omitempty" xml:"comment,omitempty"`
	Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
	GUID string `json:"guid" yaml:"guid" xml:"guid"`
}

// SQLName returns the index name in lowercase for SQL compatibility.
@@ -214,6 +279,7 @@ type Relationship struct {
	ThroughSchema string `json:"through_schema,omitempty" yaml:"through_schema,omitempty" xml:"through_schema,omitempty"`
	Description string `json:"description,omitempty" yaml:"description,omitempty" xml:"description,omitempty"`
	Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
	GUID string `json:"guid" yaml:"guid" xml:"guid"`
}

// SQLName returns the relationship name in lowercase for SQL compatibility.
@@ -238,6 +304,7 @@ type Constraint struct {
	Deferrable bool `json:"deferrable,omitempty" yaml:"deferrable,omitempty" xml:"deferrable,omitempty"`
	InitiallyDeferred bool `json:"initially_deferred,omitempty" yaml:"initially_deferred,omitempty" xml:"initially_deferred,omitempty"`
	Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
	GUID string `json:"guid" yaml:"guid" xml:"guid"`
|
||||
}
|
||||
|
||||
// SQLName returns the constraint name in lowercase for SQL compatibility.
|
||||
@@ -253,6 +320,7 @@ type Enum struct {
|
||||
Name string `json:"name" yaml:"name" xml:"name"`
|
||||
Values []string `json:"values" yaml:"values" xml:"values"`
|
||||
Schema string `json:"schema,omitempty" yaml:"schema,omitempty" xml:"schema,omitempty"`
|
||||
GUID string `json:"guid" yaml:"guid" xml:"guid"`
|
||||
}
|
||||
|
||||
// SQLName returns the enum name in lowercase for SQL compatibility.
|
||||
@@ -260,6 +328,16 @@ func (d *Enum) SQLName() string {
|
||||
return strings.ToLower(d.Name)
|
||||
}
|
||||
|
||||
// InitEnum initializes a new Enum with empty values slice
|
||||
func InitEnum(name, schema string) *Enum {
|
||||
return &Enum{
|
||||
Name: name,
|
||||
Schema: schema,
|
||||
Values: make([]string, 0),
|
||||
GUID: uuid.New().String(),
|
||||
}
|
||||
}
|
||||
|
||||
// Supported constraint types.
|
||||
const (
|
||||
PrimaryKeyConstraint ConstraintType = "primary_key" // Primary key uniquely identifies each record
|
||||
@@ -281,6 +359,7 @@ type Script struct {
|
||||
Version string `json:"version,omitempty" yaml:"version,omitempty" xml:"version,omitempty"`
|
||||
Priority int `json:"priority,omitempty" yaml:"priority,omitempty" xml:"priority,omitempty"`
|
||||
Sequence uint `json:"sequence,omitempty" yaml:"sequence,omitempty" xml:"sequence,omitempty"`
|
||||
GUID string `json:"guid" yaml:"guid" xml:"guid"`
|
||||
}
|
||||
|
||||
// SQLName returns the script name in lowercase for SQL compatibility.
|
||||
@@ -295,6 +374,8 @@ func InitDatabase(name string) *Database {
|
||||
return &Database{
|
||||
Name: name,
|
||||
Schemas: make([]*Schema, 0),
|
||||
Domains: make([]*Domain, 0),
|
||||
GUID: uuid.New().String(),
|
||||
}
|
||||
}
|
||||
|
||||
@@ -308,6 +389,7 @@ func InitSchema(name string) *Schema {
|
||||
Permissions: make(map[string]string),
|
||||
Metadata: make(map[string]any),
|
||||
Scripts: make([]*Script, 0),
|
||||
GUID: uuid.New().String(),
|
||||
}
|
||||
}
|
||||
|
||||
@@ -321,6 +403,7 @@ func InitTable(name, schema string) *Table {
|
||||
Indexes: make(map[string]*Index),
|
||||
Relationships: make(map[string]*Relationship),
|
||||
Metadata: make(map[string]any),
|
||||
GUID: uuid.New().String(),
|
||||
}
|
||||
}
|
||||
|
||||
@@ -330,6 +413,7 @@ func InitColumn(name, table, schema string) *Column {
|
||||
Name: name,
|
||||
Table: table,
|
||||
Schema: schema,
|
||||
GUID: uuid.New().String(),
|
||||
}
|
||||
}
|
||||
|
||||
@@ -341,6 +425,7 @@ func InitIndex(name, table, schema string) *Index {
|
||||
Schema: schema,
|
||||
Columns: make([]string, 0),
|
||||
Include: make([]string, 0),
|
||||
GUID: uuid.New().String(),
|
||||
}
|
||||
}
|
||||
|
||||
@@ -353,6 +438,7 @@ func InitRelation(name, schema string) *Relationship {
|
||||
Properties: make(map[string]string),
|
||||
FromColumns: make([]string, 0),
|
||||
ToColumns: make([]string, 0),
|
||||
GUID: uuid.New().String(),
|
||||
}
|
||||
}
|
||||
|
||||
@@ -362,6 +448,7 @@ func InitRelationship(name string, relType RelationType) *Relationship {
|
||||
Name: name,
|
||||
Type: relType,
|
||||
Properties: make(map[string]string),
|
||||
GUID: uuid.New().String(),
|
||||
}
|
||||
}
|
||||
|
||||
@@ -372,6 +459,7 @@ func InitConstraint(name string, constraintType ConstraintType) *Constraint {
|
||||
Type: constraintType,
|
||||
Columns: make([]string, 0),
|
||||
ReferencedColumns: make([]string, 0),
|
||||
GUID: uuid.New().String(),
|
||||
}
|
||||
}
|
||||
|
||||
@@ -380,6 +468,7 @@ func InitScript(name string) *Script {
|
||||
return &Script{
|
||||
Name: name,
|
||||
RunAfter: make([]string, 0),
|
||||
GUID: uuid.New().String(),
|
||||
}
|
||||
}
|
||||
|
||||
@@ -390,6 +479,7 @@ func InitView(name, schema string) *View {
|
||||
Schema: schema,
|
||||
Columns: make(map[string]*Column),
|
||||
Metadata: make(map[string]any),
|
||||
GUID: uuid.New().String(),
|
||||
}
|
||||
}
|
||||
|
||||
@@ -400,5 +490,25 @@ func InitSequence(name, schema string) *Sequence {
|
||||
Schema: schema,
|
||||
IncrementBy: 1,
|
||||
StartValue: 1,
|
||||
GUID: uuid.New().String(),
|
||||
}
|
||||
}
|
||||
|
||||
// InitDomain initializes a new Domain with empty slices and maps
|
||||
func InitDomain(name string) *Domain {
|
||||
return &Domain{
|
||||
Name: name,
|
||||
Tables: make([]*DomainTable, 0),
|
||||
Metadata: make(map[string]any),
|
||||
GUID: uuid.New().String(),
|
||||
}
|
||||
}
|
||||
|
||||
// InitDomainTable initializes a new DomainTable reference
|
||||
func InitDomainTable(tableName, schemaName string) *DomainTable {
|
||||
return &DomainTable{
|
||||
TableName: tableName,
|
||||
SchemaName: schemaName,
|
||||
GUID: uuid.New().String(),
|
||||
}
|
||||
}
|
||||
|
||||
@@ -632,6 +632,9 @@ func (r *Reader) parseColumn(fieldName string, fieldType ast.Expr, tag string, s
		column.Name = parts[0]
	}

	// Track if we found explicit nullability markers
	hasExplicitNullableMarker := false

	// Parse tag attributes
	for _, part := range parts[1:] {
		kv := strings.SplitN(part, ":", 2)

@@ -649,6 +652,10 @@ func (r *Reader) parseColumn(fieldName string, fieldType ast.Expr, tag string, s
			column.IsPrimaryKey = true
		case "notnull":
			column.NotNull = true
			hasExplicitNullableMarker = true
		case "nullzero":
			column.NotNull = false
			hasExplicitNullableMarker = true
		case "autoincrement":
			column.AutoIncrement = true
		case "default":

@@ -664,17 +671,15 @@ func (r *Reader) parseColumn(fieldName string, fieldType ast.Expr, tag string, s

	// Determine if nullable based on Go type and bun tags
	// In Bun:
	// - nullzero tag means the field is nullable (can be NULL in DB)
	// - absence of nullzero means the field is NOT NULL
	// - primitive types (int64, bool, string) are NOT NULL by default
	column.NotNull = true
	// Primary keys are always NOT NULL

	if strings.Contains(bunTag, "nullzero") {
		column.NotNull = false
	} else {
	// - explicit "notnull" tag means NOT NULL
	// - explicit "nullzero" tag means nullable
	// - absence of explicit markers: infer from Go type
	if !hasExplicitNullableMarker {
		// Infer from Go type if no explicit marker found
		column.NotNull = !r.isNullableGoType(fieldType)
	}

	// Primary keys are always NOT NULL
	if column.IsPrimaryKey {
		column.NotNull = true
	}

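The precedence in the hunk above (an explicit `notnull`/`nullzero` tag wins, otherwise nullability is inferred from the Go type, and primary keys are forced to NOT NULL last) can be sketched as a standalone helper. The function name `inferNotNull` and its parameters are illustrative, not RelSpec's API:

```go
package main

import "strings"

// inferNotNull mirrors the precedence described above:
// 1. an explicit "notnull"/"nullzero" tag wins,
// 2. otherwise nullability is inferred from the Go type,
// 3. primary keys are forced to NOT NULL regardless.
func inferNotNull(bunTag string, nullableGoType, isPrimaryKey bool) bool {
	notNull := true // primitives default to NOT NULL
	switch {
	case strings.Contains(bunTag, "notnull"):
		notNull = true
	case strings.Contains(bunTag, "nullzero"):
		notNull = false
	default:
		// no explicit marker: pointer-like types are nullable
		notNull = !nullableGoType
	}
	if isPrimaryKey {
		notNull = true
	}
	return notNull
}

func main() {}
```

For example, a `*string` field with no tag infers nullable, while the same field tagged `pk` comes out NOT NULL.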
@@ -4,7 +4,9 @@ import (
	"bufio"
	"fmt"
	"os"
	"path/filepath"
	"regexp"
	"sort"
	"strings"

	"git.warky.dev/wdevs/relspecgo/pkg/models"

@@ -24,11 +26,23 @@ func NewReader(options *readers.ReaderOptions) *Reader {
}

// ReadDatabase reads and parses DBML input, returning a Database model
// If FilePath points to a directory, all .dbml files are loaded and merged
func (r *Reader) ReadDatabase() (*models.Database, error) {
	if r.options.FilePath == "" {
		return nil, fmt.Errorf("file path is required for DBML reader")
	}

	// Check if path is a directory
	info, err := os.Stat(r.options.FilePath)
	if err != nil {
		return nil, fmt.Errorf("failed to stat path: %w", err)
	}

	if info.IsDir() {
		return r.readDirectoryDBML(r.options.FilePath)
	}

	// Single file - existing logic
	content, err := os.ReadFile(r.options.FilePath)
	if err != nil {
		return nil, fmt.Errorf("failed to read file: %w", err)

@@ -67,15 +81,301 @@ func (r *Reader) ReadTable() (*models.Table, error) {
	return schema.Tables[0], nil
}

// stripQuotes removes surrounding quotes from an identifier
// readDirectoryDBML processes all .dbml files in directory
// Returns merged Database model
func (r *Reader) readDirectoryDBML(dirPath string) (*models.Database, error) {
	// Discover and sort DBML files
	files, err := r.discoverDBMLFiles(dirPath)
	if err != nil {
		return nil, fmt.Errorf("failed to discover DBML files: %w", err)
	}

	// If no files found, return empty database
	if len(files) == 0 {
		db := models.InitDatabase("database")
		if r.options.Metadata != nil {
			if name, ok := r.options.Metadata["name"].(string); ok {
				db.Name = name
			}
		}
		return db, nil
	}

	// Initialize database (will be merged with files)
	var db *models.Database

	// Process each file in sorted order
	for _, filePath := range files {
		content, err := os.ReadFile(filePath)
		if err != nil {
			return nil, fmt.Errorf("failed to read file %s: %w", filePath, err)
		}

		fileDB, err := r.parseDBML(string(content))
		if err != nil {
			return nil, fmt.Errorf("failed to parse file %s: %w", filePath, err)
		}

		// First file initializes the database
		if db == nil {
			db = fileDB
		} else {
			// Subsequent files are merged
			mergeDatabase(db, fileDB)
		}
	}

	return db, nil
}

// stripQuotes removes surrounding quotes and comments from an identifier
func stripQuotes(s string) string {
	s = strings.TrimSpace(s)

	// Remove DBML comments in brackets (e.g., [note: 'description'])
	// This handles inline comments like: "table_name" [note: 'comment']
	commentRegex := regexp.MustCompile(`\s*\[.*?\]\s*`)
	s = commentRegex.ReplaceAllString(s, "")

	// Trim again after removing comments
	s = strings.TrimSpace(s)

	// Remove surrounding quotes (double or single)
	if len(s) >= 2 && ((s[0] == '"' && s[len(s)-1] == '"') || (s[0] == '\'' && s[len(s)-1] == '\'')) {
		return s[1 : len(s)-1]
	}
	return s
}

// parseFilePrefix extracts numeric prefix from filename
// Examples: "1_schema.dbml" -> (1, true), "tables.dbml" -> (0, false)
func parseFilePrefix(filename string) (int, bool) {
	base := filepath.Base(filename)
	re := regexp.MustCompile(`^(\d+)[_-]`)
	matches := re.FindStringSubmatch(base)
	if len(matches) > 1 {
		var prefix int
		_, err := fmt.Sscanf(matches[1], "%d", &prefix)
		if err == nil {
			return prefix, true
		}
	}
	return 0, false
}

// hasCommentedRefs scans file content for commented-out Ref statements
// Returns true if file contains lines like: // Ref: table.col > other.col
func hasCommentedRefs(filePath string) (bool, error) {
	content, err := os.ReadFile(filePath)
	if err != nil {
		return false, err
	}

	scanner := bufio.NewScanner(strings.NewReader(string(content)))
	commentedRefRegex := regexp.MustCompile(`^\s*//.*Ref:\s+`)

	for scanner.Scan() {
		line := scanner.Text()
		if commentedRefRegex.MatchString(line) {
			return true, nil
		}
	}

	return false, nil
}

// discoverDBMLFiles finds all .dbml files in directory and returns them sorted
func (r *Reader) discoverDBMLFiles(dirPath string) ([]string, error) {
	pattern := filepath.Join(dirPath, "*.dbml")
	files, err := filepath.Glob(pattern)
	if err != nil {
		return nil, fmt.Errorf("failed to glob .dbml files: %w", err)
	}

	return sortDBMLFiles(files), nil
}

// sortDBMLFiles sorts files by:
// 1. Files without commented refs (by numeric prefix, then alphabetically)
// 2. Files with commented refs (by numeric prefix, then alphabetically)
func sortDBMLFiles(files []string) []string {
	// Create a slice to hold file info for sorting
	type fileInfo struct {
		path         string
		hasCommented bool
		prefix       int
		hasPrefix    bool
		basename     string
	}

	fileInfos := make([]fileInfo, 0, len(files))

	for _, file := range files {
		hasCommented, err := hasCommentedRefs(file)
		if err != nil {
			// If we can't read the file, treat it as not having commented refs
			hasCommented = false
		}

		prefix, hasPrefix := parseFilePrefix(file)
		basename := filepath.Base(file)

		fileInfos = append(fileInfos, fileInfo{
			path:         file,
			hasCommented: hasCommented,
			prefix:       prefix,
			hasPrefix:    hasPrefix,
			basename:     basename,
		})
	}

	// Sort by: hasCommented (false first), hasPrefix (true first), prefix, basename
	sort.Slice(fileInfos, func(i, j int) bool {
		// First, sort by commented refs (files without commented refs come first)
		if fileInfos[i].hasCommented != fileInfos[j].hasCommented {
			return !fileInfos[i].hasCommented
		}

		// Then by presence of prefix (files with prefix come first)
		if fileInfos[i].hasPrefix != fileInfos[j].hasPrefix {
			return fileInfos[i].hasPrefix
		}

		// If both have prefix, sort by prefix value
		if fileInfos[i].hasPrefix && fileInfos[j].hasPrefix {
			if fileInfos[i].prefix != fileInfos[j].prefix {
				return fileInfos[i].prefix < fileInfos[j].prefix
			}
		}

		// Finally, sort alphabetically by basename
		return fileInfos[i].basename < fileInfos[j].basename
	})

	// Extract sorted paths
	sortedFiles := make([]string, len(fileInfos))
	for i, info := range fileInfos {
		sortedFiles[i] = info.path
	}

	return sortedFiles
}

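The ordering that sortDBMLFiles produces can be demonstrated with a simplified, self-contained sketch; the `fileInfo` shape mirrors the diff above but keeps only the basename, and `order` is an illustrative name:

```go
package main

import "sort"

// A simplified version of the ordering rules above: files without
// commented-out Refs come first, numeric prefixes sort before
// unprefixed names, then by prefix value, then alphabetically.
type fileInfo struct {
	basename     string
	hasCommented bool
	prefix       int
	hasPrefix    bool
}

func order(files []fileInfo) []string {
	sort.Slice(files, func(i, j int) bool {
		if files[i].hasCommented != files[j].hasCommented {
			return !files[i].hasCommented
		}
		if files[i].hasPrefix != files[j].hasPrefix {
			return files[i].hasPrefix
		}
		if files[i].hasPrefix && files[i].prefix != files[j].prefix {
			return files[i].prefix < files[j].prefix
		}
		return files[i].basename < files[j].basename
	})
	out := make([]string, len(files))
	for i, f := range files {
		out[i] = f.basename
	}
	return out
}

func main() {}
```

Given `9_refs.dbml` (with commented refs), `tables.dbml`, and `1_users.dbml`, this yields `1_users.dbml`, `tables.dbml`, `9_refs.dbml` — refs files load last so the tables they reference already exist.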
// mergeTable combines two table definitions
// Merges: Columns (map), Constraints (map), Indexes (map), Relationships (map)
// Uses first non-empty Description
func mergeTable(baseTable, fileTable *models.Table) {
	// Merge columns (map naturally merges - later keys overwrite)
	for key, col := range fileTable.Columns {
		baseTable.Columns[key] = col
	}

	// Merge constraints
	for key, constraint := range fileTable.Constraints {
		baseTable.Constraints[key] = constraint
	}

	// Merge indexes
	for key, index := range fileTable.Indexes {
		baseTable.Indexes[key] = index
	}

	// Merge relationships
	for key, rel := range fileTable.Relationships {
		baseTable.Relationships[key] = rel
	}

	// Use first non-empty description
	if baseTable.Description == "" && fileTable.Description != "" {
		baseTable.Description = fileTable.Description
	}

	// Merge metadata maps
	if baseTable.Metadata == nil {
		baseTable.Metadata = make(map[string]any)
	}
	for key, val := range fileTable.Metadata {
		baseTable.Metadata[key] = val
	}
}

// mergeSchema finds or creates schema and merges tables
func mergeSchema(baseDB *models.Database, fileSchema *models.Schema) {
	// Find existing schema by name (normalize names by stripping quotes)
	var existingSchema *models.Schema
	fileSchemaName := stripQuotes(fileSchema.Name)
	for _, schema := range baseDB.Schemas {
		if stripQuotes(schema.Name) == fileSchemaName {
			existingSchema = schema
			break
		}
	}

	// If schema doesn't exist, add it and return
	if existingSchema == nil {
		baseDB.Schemas = append(baseDB.Schemas, fileSchema)
		return
	}

	// Merge tables from fileSchema into existingSchema
	for _, fileTable := range fileSchema.Tables {
		// Find existing table by name (normalize names by stripping quotes)
		var existingTable *models.Table
		fileTableName := stripQuotes(fileTable.Name)
		for _, table := range existingSchema.Tables {
			if stripQuotes(table.Name) == fileTableName {
				existingTable = table
				break
			}
		}

		// If table doesn't exist, add it
		if existingTable == nil {
			existingSchema.Tables = append(existingSchema.Tables, fileTable)
		} else {
			// Table exists in both files - merge its contents
			mergeTable(existingTable, fileTable)
		}
	}

	// Merge other schema properties
	existingSchema.Views = append(existingSchema.Views, fileSchema.Views...)
	existingSchema.Sequences = append(existingSchema.Sequences, fileSchema.Sequences...)
	existingSchema.Scripts = append(existingSchema.Scripts, fileSchema.Scripts...)

	// Merge permissions
	if existingSchema.Permissions == nil {
		existingSchema.Permissions = make(map[string]string)
	}
	for key, val := range fileSchema.Permissions {
		existingSchema.Permissions[key] = val
	}

	// Merge metadata
	if existingSchema.Metadata == nil {
		existingSchema.Metadata = make(map[string]any)
	}
	for key, val := range fileSchema.Metadata {
		existingSchema.Metadata[key] = val
	}
}

// mergeDatabase merges schemas from fileDB into baseDB
func mergeDatabase(baseDB, fileDB *models.Database) {
	// Merge each schema from fileDB
	for _, fileSchema := range fileDB.Schemas {
		mergeSchema(baseDB, fileSchema)
	}

	// Merge domains
	baseDB.Domains = append(baseDB.Domains, fileDB.Domains...)

	// Use first non-empty description
	if baseDB.Description == "" && fileDB.Description != "" {
		baseDB.Description = fileDB.Description
	}
}

// parseDBML parses DBML content and returns a Database model
func (r *Reader) parseDBML(content string) (*models.Database, error) {
	db := models.InitDatabase("database")

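mergeTable relies on plain Go map semantics: copying the later file's entries into the base map means a column (or constraint, index, relationship) redefined in a later file silently overwrites the earlier definition, while new keys are added. A minimal sketch with string values in place of `*models.Column`:

```go
package main

// mergeColumns shows the overwrite-on-merge behavior used by mergeTable:
// keys from the later file win, keys only in the base survive untouched.
func mergeColumns(base, file map[string]string) {
	for k, v := range file {
		base[k] = v
	}
}

func main() {}
```

So if `1_users.dbml` defines `email varchar(100)` and `3_add_columns.dbml` redefines it as `varchar(255)`, the merged table keeps `varchar(255)`.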
@@ -332,27 +632,31 @@ func (r *Reader) parseIndex(line, tableName, schemaName string) *models.Index {
	// Format: (columns) [attributes] OR columnname [attributes]
	var columns []string

	if strings.Contains(line, "(") && strings.Contains(line, ")") {
	// Find the attributes section to avoid parsing parentheses in notes/attributes
	attrStart := strings.Index(line, "[")
	columnPart := line
	if attrStart > 0 {
		columnPart = line[:attrStart]
	}

	if strings.Contains(columnPart, "(") && strings.Contains(columnPart, ")") {
		// Multi-column format: (col1, col2) [attributes]
		colStart := strings.Index(line, "(")
		colEnd := strings.Index(line, ")")
		colStart := strings.Index(columnPart, "(")
		colEnd := strings.Index(columnPart, ")")
		if colStart >= colEnd {
			return nil
		}

		columnsStr := line[colStart+1 : colEnd]
		columnsStr := columnPart[colStart+1 : colEnd]
		for _, col := range strings.Split(columnsStr, ",") {
			columns = append(columns, stripQuotes(strings.TrimSpace(col)))
		}
	} else if strings.Contains(line, "[") {
	} else if attrStart > 0 {
		// Single column format: columnname [attributes]
		// Extract column name before the bracket
		idx := strings.Index(line, "[")
		if idx > 0 {
			colName := strings.TrimSpace(line[:idx])
			if colName != "" {
				columns = []string{stripQuotes(colName)}
			}
		colName := strings.TrimSpace(columnPart)
		if colName != "" {
			columns = []string{stripQuotes(colName)}
		}
	}

@@ -1,6 +1,7 @@
package dbml

import (
	"os"
	"path/filepath"
	"testing"

@@ -517,3 +518,286 @@ func TestGetForeignKeys(t *testing.T) {
		t.Error("Expected foreign key constraint type")
	}
}

// Tests for multi-file directory loading

func TestReadDirectory_MultipleFiles(t *testing.T) {
	opts := &readers.ReaderOptions{
		FilePath: filepath.Join("..", "..", "..", "tests", "assets", "dbml", "multifile"),
	}

	reader := NewReader(opts)
	db, err := reader.ReadDatabase()
	if err != nil {
		t.Fatalf("ReadDatabase() error = %v", err)
	}

	if db == nil {
		t.Fatal("ReadDatabase() returned nil database")
	}

	// Should have public schema
	if len(db.Schemas) == 0 {
		t.Fatal("Expected at least one schema")
	}

	var publicSchema *models.Schema
	for _, schema := range db.Schemas {
		if schema.Name == "public" {
			publicSchema = schema
			break
		}
	}

	if publicSchema == nil {
		t.Fatal("Public schema not found")
	}

	// Should have 3 tables: users, posts, comments
	if len(publicSchema.Tables) != 3 {
		t.Fatalf("Expected 3 tables, got %d", len(publicSchema.Tables))
	}

	// Find tables
	var usersTable, postsTable, commentsTable *models.Table
	for _, table := range publicSchema.Tables {
		switch table.Name {
		case "users":
			usersTable = table
		case "posts":
			postsTable = table
		case "comments":
			commentsTable = table
		}
	}

	if usersTable == nil {
		t.Fatal("Users table not found")
	}
	if postsTable == nil {
		t.Fatal("Posts table not found")
	}
	if commentsTable == nil {
		t.Fatal("Comments table not found")
	}

	// Verify users table has merged columns from 1_users.dbml and 3_add_columns.dbml
	expectedUserColumns := []string{"id", "email", "name", "created_at"}
	if len(usersTable.Columns) != len(expectedUserColumns) {
		t.Errorf("Expected %d columns in users table, got %d", len(expectedUserColumns), len(usersTable.Columns))
	}

	for _, colName := range expectedUserColumns {
		if _, exists := usersTable.Columns[colName]; !exists {
			t.Errorf("Expected column '%s' in users table", colName)
		}
	}

	// Verify posts table columns
	expectedPostColumns := []string{"id", "user_id", "title", "content", "created_at"}
	for _, colName := range expectedPostColumns {
		if _, exists := postsTable.Columns[colName]; !exists {
			t.Errorf("Expected column '%s' in posts table", colName)
		}
	}
}

func TestReadDirectory_TableMerging(t *testing.T) {
	opts := &readers.ReaderOptions{
		FilePath: filepath.Join("..", "..", "..", "tests", "assets", "dbml", "multifile"),
	}

	reader := NewReader(opts)
	db, err := reader.ReadDatabase()
	if err != nil {
		t.Fatalf("ReadDatabase() error = %v", err)
	}

	// Find users table
	var usersTable *models.Table
	for _, schema := range db.Schemas {
		for _, table := range schema.Tables {
			if table.Name == "users" && schema.Name == "public" {
				usersTable = table
				break
			}
		}
	}

	if usersTable == nil {
		t.Fatal("Users table not found")
	}

	// Verify columns from file 1 (id, email)
	if _, exists := usersTable.Columns["id"]; !exists {
		t.Error("Column 'id' from 1_users.dbml not found")
	}
	if _, exists := usersTable.Columns["email"]; !exists {
		t.Error("Column 'email' from 1_users.dbml not found")
	}

	// Verify columns from file 3 (name, created_at)
	if _, exists := usersTable.Columns["name"]; !exists {
		t.Error("Column 'name' from 3_add_columns.dbml not found")
	}
	if _, exists := usersTable.Columns["created_at"]; !exists {
		t.Error("Column 'created_at' from 3_add_columns.dbml not found")
	}

	// Verify column properties from file 1
	emailCol := usersTable.Columns["email"]
	if !emailCol.NotNull {
		t.Error("Email column should be not null (from 1_users.dbml)")
	}
	if emailCol.Type != "varchar(255)" {
		t.Errorf("Expected email type 'varchar(255)', got '%s'", emailCol.Type)
	}
}

func TestReadDirectory_CommentedRefsLast(t *testing.T) {
	// This test verifies that files with commented refs are processed last
	// by checking that the file discovery returns them in the correct order
	dirPath := filepath.Join("..", "..", "..", "tests", "assets", "dbml", "multifile")

	opts := &readers.ReaderOptions{
		FilePath: dirPath,
	}

	reader := NewReader(opts)
	files, err := reader.discoverDBMLFiles(dirPath)
	if err != nil {
		t.Fatalf("discoverDBMLFiles() error = %v", err)
	}

	if len(files) < 2 {
		t.Skip("Not enough files to test ordering")
	}

	// Check that 9_refs.dbml (which has commented refs) comes last
	lastFile := filepath.Base(files[len(files)-1])
	if lastFile != "9_refs.dbml" {
		t.Errorf("Expected last file to be '9_refs.dbml' (has commented refs), got '%s'", lastFile)
	}

	// Check that numbered files without commented refs come first
	firstFile := filepath.Base(files[0])
	if firstFile != "1_users.dbml" {
		t.Errorf("Expected first file to be '1_users.dbml', got '%s'", firstFile)
	}
}

func TestReadDirectory_EmptyDirectory(t *testing.T) {
	// Create a temporary empty directory
	tmpDir := filepath.Join("..", "..", "..", "tests", "assets", "dbml", "empty_test_dir")
	err := os.MkdirAll(tmpDir, 0755)
	if err != nil {
		t.Fatalf("Failed to create temp directory: %v", err)
	}
	defer os.RemoveAll(tmpDir)

	opts := &readers.ReaderOptions{
		FilePath: tmpDir,
	}

	reader := NewReader(opts)
	db, err := reader.ReadDatabase()
	if err != nil {
		t.Fatalf("ReadDatabase() should not error on empty directory, got: %v", err)
	}

	if db == nil {
		t.Fatal("ReadDatabase() returned nil database")
	}

	// Empty directory should return empty database
	if len(db.Schemas) != 0 {
		t.Errorf("Expected 0 schemas for empty directory, got %d", len(db.Schemas))
	}
}

func TestReadDatabase_BackwardCompat(t *testing.T) {
	// Test that single file loading still works
	opts := &readers.ReaderOptions{
		FilePath: filepath.Join("..", "..", "..", "tests", "assets", "dbml", "simple.dbml"),
	}

	reader := NewReader(opts)
	db, err := reader.ReadDatabase()
	if err != nil {
		t.Fatalf("ReadDatabase() error = %v", err)
	}

	if db == nil {
		t.Fatal("ReadDatabase() returned nil database")
	}

	if len(db.Schemas) == 0 {
		t.Fatal("Expected at least one schema")
	}

	schema := db.Schemas[0]
	if len(schema.Tables) != 1 {
		t.Fatalf("Expected 1 table, got %d", len(schema.Tables))
	}

	table := schema.Tables[0]
	if table.Name != "users" {
		t.Errorf("Expected table name 'users', got '%s'", table.Name)
	}
}

func TestParseFilePrefix(t *testing.T) {
	tests := []struct {
		filename   string
		wantPrefix int
		wantHas    bool
	}{
		{"1_schema.dbml", 1, true},
		{"2_tables.dbml", 2, true},
		{"10_relationships.dbml", 10, true},
		{"99_data.dbml", 99, true},
		{"schema.dbml", 0, false},
		{"tables_no_prefix.dbml", 0, false},
		{"/path/to/1_file.dbml", 1, true},
		{"/path/to/file.dbml", 0, false},
		{"1-file.dbml", 1, true},
		{"2-another.dbml", 2, true},
	}

	for _, tt := range tests {
		t.Run(tt.filename, func(t *testing.T) {
			gotPrefix, gotHas := parseFilePrefix(tt.filename)
			if gotPrefix != tt.wantPrefix {
				t.Errorf("parseFilePrefix(%s) prefix = %d, want %d", tt.filename, gotPrefix, tt.wantPrefix)
			}
			if gotHas != tt.wantHas {
				t.Errorf("parseFilePrefix(%s) hasPrefix = %v, want %v", tt.filename, gotHas, tt.wantHas)
			}
		})
	}
}

func TestHasCommentedRefs(t *testing.T) {
	// Test with the actual multifile test fixtures
	tests := []struct {
		filename string
		wantHas  bool
	}{
		{filepath.Join("..", "..", "..", "tests", "assets", "dbml", "multifile", "1_users.dbml"), false},
		{filepath.Join("..", "..", "..", "tests", "assets", "dbml", "multifile", "2_posts.dbml"), false},
		{filepath.Join("..", "..", "..", "tests", "assets", "dbml", "multifile", "3_add_columns.dbml"), false},
		{filepath.Join("..", "..", "..", "tests", "assets", "dbml", "multifile", "9_refs.dbml"), true},
	}

	for _, tt := range tests {
		t.Run(filepath.Base(tt.filename), func(t *testing.T) {
			gotHas, err := hasCommentedRefs(tt.filename)
			if err != nil {
				t.Fatalf("hasCommentedRefs() error = %v", err)
			}
			if gotHas != tt.wantHas {
				t.Errorf("hasCommentedRefs(%s) = %v, want %v", filepath.Base(tt.filename), gotHas, tt.wantHas)
			}
		})
	}
}

@@ -79,6 +79,8 @@ func (r *Reader) convertToDatabase(dctx *models.DCTXDictionary) (*models.Databas
 	db := models.InitDatabase(dbName)
 	schema := models.InitSchema("public")

+	// Note: DCTX doesn't have database GUID, but schema can use dictionary name if available
+
 	// Create GUID mappings for tables and keys
 	tableGuidMap := make(map[string]string)        // GUID -> table name
 	keyGuidMap := make(map[string]*models.DCTXKey) // GUID -> key definition

@@ -162,6 +164,10 @@ func (r *Reader) convertTable(dctxTable *models.DCTXTable) (*models.Table, map[s
 	tableName := r.sanitizeName(dctxTable.Name)
 	table := models.InitTable(tableName, "public")
 	table.Description = dctxTable.Description
+	// Assign GUID from DCTX table
+	if dctxTable.Guid != "" {
+		table.GUID = dctxTable.Guid
+	}

 	fieldGuidMap := make(map[string]string)

@@ -202,6 +208,10 @@ func (r *Reader) convertField(dctxField *models.DCTXField, tableName string) ([]

 	// Convert single field
 	column := models.InitColumn(r.sanitizeName(dctxField.Name), tableName, "public")
+	// Assign GUID from DCTX field
+	if dctxField.Guid != "" {
+		column.GUID = dctxField.Guid
+	}

 	// Map Clarion data types
 	dataType, length := r.mapDataType(dctxField.DataType, dctxField.Size)

@@ -346,6 +356,10 @@ func (r *Reader) convertKey(dctxKey *models.DCTXKey, table *models.Table, fieldG
 	constraint.Table = table.Name
 	constraint.Schema = table.Schema
 	constraint.Columns = columns
+	// Assign GUID from DCTX key
+	if dctxKey.Guid != "" {
+		constraint.GUID = dctxKey.Guid
+	}

 	table.Constraints[constraint.Name] = constraint

@@ -366,6 +380,10 @@ func (r *Reader) convertKey(dctxKey *models.DCTXKey, table *models.Table, fieldG
 	index.Columns = columns
 	index.Unique = dctxKey.Unique
 	index.Type = "btree"
+	// Assign GUID from DCTX key
+	if dctxKey.Guid != "" {
+		index.GUID = dctxKey.Guid
+	}

 	table.Indexes[index.Name] = index
 	return nil

@@ -460,6 +478,10 @@ func (r *Reader) processRelations(dctx *models.DCTXDictionary, schema *models.Sc
 	constraint.ReferencedColumns = pkColumns
 	constraint.OnDelete = r.mapReferentialAction(relation.Delete)
 	constraint.OnUpdate = r.mapReferentialAction(relation.Update)
+	// Assign GUID from DCTX relation
+	if relation.Guid != "" {
+		constraint.GUID = relation.Guid
+	}

 	foreignTable.Constraints[fkName] = constraint

@@ -473,6 +495,10 @@ func (r *Reader) processRelations(dctx *models.DCTXDictionary, schema *models.Sc
 	relationship.ForeignKey = fkName
 	relationship.Properties["on_delete"] = constraint.OnDelete
 	relationship.Properties["on_update"] = constraint.OnUpdate
+	// Assign GUID from DCTX relation
+	if relation.Guid != "" {
+		relationship.GUID = relation.Guid
+	}

 	foreignTable.Relationships[relationshipName] = relationship
 }
@@ -140,6 +140,32 @@ func (r *Reader) convertToDatabase(drawSchema *drawdb.DrawDBSchema) (*models.Dat
 		db.Schemas = append(db.Schemas, schema)
 	}

+	// Convert DrawDB subject areas to domains
+	for _, area := range drawSchema.SubjectAreas {
+		domain := models.InitDomain(area.Name)
+
+		// Find all tables that visually belong to this area
+		// A table belongs to an area if its position is within the area bounds
+		for _, drawTable := range drawSchema.Tables {
+			if drawTable.X >= area.X && drawTable.X <= (area.X+area.Width) &&
+				drawTable.Y >= area.Y && drawTable.Y <= (area.Y+area.Height) {
+
+				schemaName := drawTable.Schema
+				if schemaName == "" {
+					schemaName = "public"
+				}
+
+				domainTable := models.InitDomainTable(drawTable.Name, schemaName)
+				domain.Tables = append(domain.Tables, domainTable)
+			}
+		}
+
+		// Only add domain if it has tables
+		if len(domain.Tables) > 0 {
+			db.Domains = append(db.Domains, domain)
+		}
+	}
+
 	return db, nil
 }
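The area-containment test in the hunk above can be sketched in isolation. A minimal example (the `inArea` helper is hypothetical, not part of the RelSpec codebase), assuming a table belongs to a subject area when its anchor point falls inside the area's bounding box:

```go
package main

import "fmt"

// inArea reports whether a point (x, y) lies within the rectangle
// anchored at (ax, ay) with the given width and height - the same
// bounds check the DrawDB reader uses to map tables to subject areas.
func inArea(x, y, ax, ay, w, h float64) bool {
	return x >= ax && x <= ax+w && y >= ay && y <= ay+h
}

func main() {
	fmt.Println(inArea(50, 50, 0, 0, 100, 100))  // true: inside the area
	fmt.Println(inArea(150, 50, 0, 0, 100, 100)) // false: past the right edge
}
```

Note this checks only the table's top-left anchor, so a table straddling the area border still counts as inside if its anchor does.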
@@ -241,11 +241,9 @@ func (r *Reader) parsePgEnum(line string, matches []string) *models.Enum {
 		}
 	}

-	return &models.Enum{
-		Name:   enumName,
-		Values: values,
-		Schema: "public",
-	}
+	enum := models.InitEnum(enumName, "public")
+	enum.Values = values
+	return enum
 }

 // parseTableBlock parses a complete pgTable definition block

@@ -260,11 +260,7 @@ func (r *Reader) parseType(typeName string, lines []string, schema *models.Schem
 }

 func (r *Reader) parseEnum(enumName string, lines []string, schema *models.Schema) {
-	enum := &models.Enum{
-		Name:   enumName,
-		Schema: schema.Name,
-		Values: make([]string, 0),
-	}
+	enum := models.InitEnum(enumName, schema.Name)

 	for _, line := range lines {
 		trimmed := strings.TrimSpace(line)

@@ -128,11 +128,7 @@ func (r *Reader) parsePrisma(content string) (*models.Database, error) {
 	if matches := enumRegex.FindStringSubmatch(trimmed); matches != nil {
 		currentBlock = "enum"
 		enumName := matches[1]
-		currentEnum = &models.Enum{
-			Name:   enumName,
-			Schema: "public",
-			Values: make([]string, 0),
-		}
+		currentEnum = models.InitEnum(enumName, "public")
 		blockContent = []string{}
 		continue
 	}

@@ -150,13 +150,11 @@ func (r *Reader) readScripts() ([]*models.Script, error) {
 	}

 	// Create Script model
-	script := &models.Script{
-		Name:        name,
-		Description: fmt.Sprintf("SQL script from %s", relPath),
-		SQL:         string(content),
-		Priority:    priority,
-		Sequence:    uint(sequence),
-	}
+	script := models.InitScript(name)
+	script.Description = fmt.Sprintf("SQL script from %s", relPath)
+	script.SQL = string(content)
+	script.Priority = priority
+	script.Sequence = uint(sequence)

 	scripts = append(scripts, script)
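The hunks above all make the same change: struct literals are replaced by `Init*` constructors so defaulting lives in one place. A minimal sketch of the pattern (the `Enum`/`InitEnum` pair here is a standalone stand-in, not the actual `models` package):

```go
package main

import "fmt"

// Enum mirrors the shape of the model being constructed in the hunks above.
type Enum struct {
	Name   string
	Schema string
	Values []string
}

// InitEnum centralizes initialization (e.g. a non-nil Values slice) so
// every call site gets a consistently defaulted struct instead of
// repeating the literal.
func InitEnum(name, schema string) *Enum {
	return &Enum{Name: name, Schema: schema, Values: make([]string, 0)}
}

func main() {
	e := InitEnum("status", "public")
	e.Values = append(e.Values, "active", "archived")
	fmt.Println(e.Name, len(e.Values)) // status 2
}
```

The payoff is that a new field with a required default only needs one edit in the constructor rather than one per literal.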
95  pkg/ui/column_dataops.go  Normal file
@@ -0,0 +1,95 @@
package ui

import "git.warky.dev/wdevs/relspecgo/pkg/models"

// Column data operations - business logic for column management

// CreateColumn creates a new column and adds it to a table
func (se *SchemaEditor) CreateColumn(schemaIndex, tableIndex int, name, dataType string, isPrimaryKey, isNotNull bool) *models.Column {
	table := se.GetTable(schemaIndex, tableIndex)
	if table == nil {
		return nil
	}

	if table.Columns == nil {
		table.Columns = make(map[string]*models.Column)
	}

	newColumn := &models.Column{
		Name:         name,
		Type:         dataType,
		IsPrimaryKey: isPrimaryKey,
		NotNull:      isNotNull,
	}
	table.UpdateDate()
	table.Columns[name] = newColumn
	return newColumn
}

// UpdateColumn updates an existing column's properties
func (se *SchemaEditor) UpdateColumn(schemaIndex, tableIndex int, oldName, newName, dataType string, isPrimaryKey, isNotNull bool, defaultValue interface{}, description string) bool {
	table := se.GetTable(schemaIndex, tableIndex)
	if table == nil {
		return false
	}

	column, exists := table.Columns[oldName]
	if !exists {
		return false
	}

	table.UpdateDate()

	// If name changed, remove old entry and create new one
	if oldName != newName {
		delete(table.Columns, oldName)
		column.Name = newName
		table.Columns[newName] = column
	}

	// Update properties
	column.Type = dataType
	column.IsPrimaryKey = isPrimaryKey
	column.NotNull = isNotNull
	column.Default = defaultValue
	column.Description = description

	return true
}

// DeleteColumn removes a column from a table
func (se *SchemaEditor) DeleteColumn(schemaIndex, tableIndex int, columnName string) bool {
	table := se.GetTable(schemaIndex, tableIndex)
	if table == nil {
		return false
	}

	if _, exists := table.Columns[columnName]; !exists {
		return false
	}

	table.UpdateDate()

	delete(table.Columns, columnName)
	return true
}

// GetColumn returns a column by name
func (se *SchemaEditor) GetColumn(schemaIndex, tableIndex int, columnName string) *models.Column {
	table := se.GetTable(schemaIndex, tableIndex)
	if table == nil {
		return nil
	}

	return table.Columns[columnName]
}

// GetAllColumns returns all columns in a table
func (se *SchemaEditor) GetAllColumns(schemaIndex, tableIndex int) map[string]*models.Column {
	table := se.GetTable(schemaIndex, tableIndex)
	if table == nil {
		return nil
	}

	return table.Columns
}
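The rename branch in `UpdateColumn` above relies on the map key and the column's `Name` field staying in sync. The core move can be sketched independently (the `renameKey` helper is illustrative only, operating on a plain `map[string]string` instead of `map[string]*models.Column`):

```go
package main

import "fmt"

// renameKey moves an entry in a name-keyed map, mirroring UpdateColumn's
// rename handling: look up the old key, delete it, reinsert under the new
// one. Returns false when the old key does not exist.
func renameKey(m map[string]string, oldName, newName string) bool {
	v, ok := m[oldName]
	if !ok {
		return false
	}
	if oldName != newName {
		delete(m, oldName)
		m[newName] = v
	}
	return true
}

func main() {
	cols := map[string]string{"id": "BIGINT"}
	renameKey(cols, "id", "user_id")
	fmt.Println(cols["user_id"], len(cols)) // BIGINT 1
}
```

Deleting before reinserting keeps the map at one entry per column; note that renaming onto an existing key would silently overwrite it, which the UI code above does not guard against either.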
214  pkg/ui/column_screens.go  Normal file
@@ -0,0 +1,214 @@
package ui

import (
	"fmt"

	"github.com/gdamore/tcell/v2"
	"github.com/rivo/tview"

	"git.warky.dev/wdevs/relspecgo/pkg/models"
)

// showColumnEditor shows editor for a specific column
func (se *SchemaEditor) showColumnEditor(schemaIndex, tableIndex, colIndex int, column *models.Column) {
	form := tview.NewForm()

	// Store original name to handle renames
	originalName := column.Name

	// Local variables to collect changes
	newName := column.Name
	newType := column.Type
	newIsPK := column.IsPrimaryKey
	newIsNotNull := column.NotNull
	newDefault := column.Default
	newDescription := column.Description
	newGUID := column.GUID

	// Column type options: PostgreSQL, MySQL, SQL Server, and common SQL types
	columnTypes := []string{
		// Numeric Types
		"SMALLINT", "INTEGER", "BIGINT", "INT", "TINYINT", "FLOAT", "REAL", "DOUBLE PRECISION",
		"DECIMAL(10,2)", "NUMERIC", "DECIMAL", "NUMERIC(10,2)",
		// Character Types
		"CHAR", "VARCHAR", "VARCHAR(255)", "TEXT", "NCHAR", "NVARCHAR", "NVARCHAR(255)",
		// Boolean
		"BOOLEAN", "BOOL", "BIT",
		// Date/Time Types
		"DATE", "TIME", "TIMESTAMP", "TIMESTAMP WITH TIME ZONE", "INTERVAL",
		"DATETIME", "DATETIME2", "DATEFIRST",
		// UUID and JSON
		"UUID", "GUID", "JSON", "JSONB",
		// Binary Types
		"BYTEA", "BLOB", "IMAGE", "VARBINARY", "VARBINARY(MAX)", "BINARY",
		// PostgreSQL Special Types
		"int4range", "int8range", "numrange", "tsrange", "tstzrange", "daterange",
		"HSTORE", "CITEXT", "INET", "MACADDR", "POINT", "LINE", "LSEG", "BOX", "PATH", "POLYGON", "CIRCLE",
		// Array Types
		"INTEGER ARRAY", "VARCHAR ARRAY", "TEXT ARRAY", "BIGINT ARRAY",
		// MySQL Specific
		"MEDIUMINT", "DOUBLE", "FLOAT(10,2)",
		// SQL Server Specific
		"MONEY", "SMALLMONEY", "SQL_VARIANT",
	}
	selectedTypeIndex := 0

	// Add existing type if not already in the list
	typeExists := false
	for i, opt := range columnTypes {
		if opt == column.Type {
			selectedTypeIndex = i
			typeExists = true
			break
		}
	}
	if !typeExists && column.Type != "" {
		columnTypes = append(columnTypes, column.Type)
		selectedTypeIndex = len(columnTypes) - 1
	}

	form.AddInputField("Column Name", column.Name, 40, nil, func(value string) {
		newName = value
	})

	form.AddDropDown("Type", columnTypes, selectedTypeIndex, func(option string, index int) {
		newType = option
	})

	form.AddCheckbox("Primary Key", column.IsPrimaryKey, func(checked bool) {
		newIsPK = checked
	})

	form.AddCheckbox("Not Null", column.NotNull, func(checked bool) {
		newIsNotNull = checked
	})

	defaultStr := ""
	if column.Default != nil {
		defaultStr = fmt.Sprintf("%v", column.Default)
	}
	form.AddInputField("Default Value", defaultStr, 40, nil, func(value string) {
		newDefault = value
	})

	form.AddTextArea("Description", column.Description, 40, 5, 0, func(value string) {
		newDescription = value
	})

	form.AddInputField("GUID", column.GUID, 40, nil, func(value string) {
		newGUID = value
	})

	form.AddButton("Save", func() {
		// Apply changes using dataops
		se.UpdateColumn(schemaIndex, tableIndex, originalName, newName, newType, newIsPK, newIsNotNull, newDefault, newDescription)
		se.db.Schemas[schemaIndex].Tables[tableIndex].Columns[newName].GUID = newGUID

		se.pages.RemovePage("column-editor")
		se.pages.SwitchToPage("table-editor")
	})

	form.AddButton("Delete", func() {
		se.showDeleteColumnConfirm(schemaIndex, tableIndex, originalName)
	})

	form.AddButton("Back", func() {
		// Discard changes - don't apply them
		se.pages.RemovePage("column-editor")
		se.pages.SwitchToPage("table-editor")
	})

	form.SetBorder(true).SetTitle(" Edit Column ").SetTitleAlign(tview.AlignLeft)
	form.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.showExitConfirmation("column-editor", "table-editor")
			return nil
		}
		return event
	})

	se.pages.AddPage("column-editor", form, true, true)
}

// showNewColumnDialog shows dialog to create a new column
func (se *SchemaEditor) showNewColumnDialog(schemaIndex, tableIndex int) {
	form := tview.NewForm()

	columnName := ""
	dataType := "VARCHAR(255)"

	// Column type options: PostgreSQL, MySQL, SQL Server, and common SQL types
	columnTypes := []string{
		// Numeric Types
		"SMALLINT", "INTEGER", "BIGINT", "INT", "TINYINT", "FLOAT", "REAL", "DOUBLE PRECISION",
		"DECIMAL(10,2)", "NUMERIC", "DECIMAL", "NUMERIC(10,2)",
		// Character Types
		"CHAR", "VARCHAR", "VARCHAR(255)", "TEXT", "NCHAR", "NVARCHAR", "NVARCHAR(255)",
		// Boolean
		"BOOLEAN", "BOOL", "BIT",
		// Date/Time Types
		"DATE", "TIME", "TIMESTAMP", "TIMESTAMP WITH TIME ZONE", "INTERVAL",
		"DATETIME", "DATETIME2", "DATEFIRST",
		// UUID and JSON
		"UUID", "GUID", "JSON", "JSONB",
		// Binary Types
		"BYTEA", "BLOB", "IMAGE", "VARBINARY", "VARBINARY(MAX)", "BINARY",
		// PostgreSQL Special Types
		"int4range", "int8range", "numrange", "tsrange", "tstzrange", "daterange",
		"HSTORE", "CITEXT", "INET", "MACADDR", "POINT", "LINE", "LSEG", "BOX", "PATH", "POLYGON", "CIRCLE",
		// Array Types
		"INTEGER ARRAY", "VARCHAR ARRAY", "TEXT ARRAY", "BIGINT ARRAY",
		// MySQL Specific
		"MEDIUMINT", "DOUBLE", "FLOAT(10,2)",
		// SQL Server Specific
		"MONEY", "SMALLMONEY", "SQL_VARIANT",
	}
	selectedTypeIndex := 0

	form.AddInputField("Column Name", "", 40, nil, func(value string) {
		columnName = value
	})

	form.AddDropDown("Data Type", columnTypes, selectedTypeIndex, func(option string, index int) {
		dataType = option
	})

	form.AddCheckbox("Primary Key", false, nil)
	form.AddCheckbox("Not Null", false, nil)
	form.AddCheckbox("Unique", false, nil)

	form.AddButton("Save", func() {
		if columnName == "" {
			return
		}

		// Get form values
		isPK := form.GetFormItemByLabel("Primary Key").(*tview.Checkbox).IsChecked()
		isNotNull := form.GetFormItemByLabel("Not Null").(*tview.Checkbox).IsChecked()

		se.CreateColumn(schemaIndex, tableIndex, columnName, dataType, isPK, isNotNull)

		table := se.db.Schemas[schemaIndex].Tables[tableIndex]
		se.pages.RemovePage("new-column")
		se.pages.RemovePage("table-editor")
		se.showTableEditor(schemaIndex, tableIndex, table)
	})

	form.AddButton("Back", func() {
		table := se.db.Schemas[schemaIndex].Tables[tableIndex]
		se.pages.RemovePage("new-column")
		se.pages.RemovePage("table-editor")
		se.showTableEditor(schemaIndex, tableIndex, table)
	})

	form.SetBorder(true).SetTitle(" New Column ").SetTitleAlign(tview.AlignLeft)
	form.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.showExitConfirmation("new-column", "table-editor")
			return nil
		}
		return event
	})

	se.pages.AddPage("new-column", form, true, true)
}
15  pkg/ui/database_dataops.go  Normal file
@@ -0,0 +1,15 @@
package ui

import (
	"git.warky.dev/wdevs/relspecgo/pkg/models"
)

// updateDatabase updates database properties
func (se *SchemaEditor) updateDatabase(name, description, comment, dbType, dbVersion string) {
	se.db.Name = name
	se.db.Description = description
	se.db.Comment = comment
	se.db.DatabaseType = models.DatabaseType(dbType)
	se.db.DatabaseVersion = dbVersion
	se.db.UpdateDate()
}
78  pkg/ui/database_screens.go  Normal file
@@ -0,0 +1,78 @@
package ui

import (
	"github.com/gdamore/tcell/v2"
	"github.com/rivo/tview"
)

// showEditDatabaseForm displays a dialog to edit database properties
func (se *SchemaEditor) showEditDatabaseForm() {
	form := tview.NewForm()

	dbName := se.db.Name
	dbDescription := se.db.Description
	dbComment := se.db.Comment
	dbType := string(se.db.DatabaseType)
	dbVersion := se.db.DatabaseVersion
	dbGUID := se.db.GUID

	// Database type options
	dbTypeOptions := []string{"pgsql", "mssql", "sqlite"}
	selectedTypeIndex := 0
	for i, opt := range dbTypeOptions {
		if opt == dbType {
			selectedTypeIndex = i
			break
		}
	}

	form.AddInputField("Database Name", dbName, 40, nil, func(value string) {
		dbName = value
	})

	form.AddInputField("Description", dbDescription, 50, nil, func(value string) {
		dbDescription = value
	})

	form.AddInputField("Comment", dbComment, 50, nil, func(value string) {
		dbComment = value
	})

	form.AddDropDown("Database Type", dbTypeOptions, selectedTypeIndex, func(option string, index int) {
		dbType = option
	})

	form.AddInputField("Database Version", dbVersion, 20, nil, func(value string) {
		dbVersion = value
	})

	form.AddInputField("GUID", dbGUID, 40, nil, func(value string) {
		dbGUID = value
	})

	form.AddButton("Save", func() {
		if dbName == "" {
			return
		}
		se.updateDatabase(dbName, dbDescription, dbComment, dbType, dbVersion)
		se.db.GUID = dbGUID
		se.pages.RemovePage("edit-database")
		se.pages.RemovePage("main")
		se.pages.AddPage("main", se.createMainMenu(), true, true)
	})

	form.AddButton("Back", func() {
		se.pages.RemovePage("edit-database")
	})

	form.SetBorder(true).SetTitle(" Edit Database ").SetTitleAlign(tview.AlignLeft)
	form.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.showExitConfirmation("edit-database", "main")
			return nil
		}
		return event
	})

	se.pages.AddPage("edit-database", form, true, true)
}
139  pkg/ui/dialogs.go  Normal file
@@ -0,0 +1,139 @@
package ui

import (
	"fmt"

	"github.com/gdamore/tcell/v2"
	"github.com/rivo/tview"
)

// showExitConfirmation shows a confirmation dialog when trying to exit without saving
func (se *SchemaEditor) showExitConfirmation(pageToRemove, pageToSwitchTo string) {
	modal := tview.NewModal().
		SetText("Exit without saving changes?").
		AddButtons([]string{"Cancel", "No, exit without saving"}).
		SetDoneFunc(func(buttonIndex int, buttonLabel string) {
			if buttonLabel == "No, exit without saving" {
				se.pages.RemovePage(pageToRemove)
				se.pages.SwitchToPage(pageToSwitchTo)
			}
			se.pages.RemovePage("exit-confirm")
		})

	modal.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.pages.RemovePage("exit-confirm")
			return nil
		}
		return event
	})

	se.pages.AddPage("exit-confirm", modal, true, true)
}

// showExitEditorConfirm shows confirmation dialog when trying to exit the entire editor
func (se *SchemaEditor) showExitEditorConfirm() {
	modal := tview.NewModal().
		SetText("Exit RelSpec Editor? Press ESC again to confirm.").
		AddButtons([]string{"Cancel", "Exit"}).
		SetDoneFunc(func(buttonIndex int, buttonLabel string) {
			if buttonLabel == "Exit" {
				se.app.Stop()
			}
			se.pages.RemovePage("exit-editor-confirm")
		})

	modal.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.app.Stop()
			return nil
		}
		return event
	})

	se.pages.AddPage("exit-editor-confirm", modal, true, true)
}

// showDeleteSchemaConfirm shows confirmation dialog for schema deletion
func (se *SchemaEditor) showDeleteSchemaConfirm(schemaIndex int) {
	modal := tview.NewModal().
		SetText(fmt.Sprintf("Delete schema '%s'? This will delete all tables in this schema.",
			se.db.Schemas[schemaIndex].Name)).
		AddButtons([]string{"Cancel", "Delete"}).
		SetDoneFunc(func(buttonIndex int, buttonLabel string) {
			if buttonLabel == "Delete" {
				se.DeleteSchema(schemaIndex)
				se.pages.RemovePage("schema-editor")
				se.pages.RemovePage("schemas")
				se.showSchemaList()
			}
			se.pages.RemovePage("confirm-delete-schema")
		})

	modal.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.pages.RemovePage("confirm-delete-schema")
			return nil
		}
		return event
	})

	se.pages.AddPage("confirm-delete-schema", modal, true, true)
}

// showDeleteTableConfirm shows confirmation dialog for table deletion
func (se *SchemaEditor) showDeleteTableConfirm(schemaIndex, tableIndex int) {
	table := se.db.Schemas[schemaIndex].Tables[tableIndex]
	modal := tview.NewModal().
		SetText(fmt.Sprintf("Delete table '%s'? This action cannot be undone.",
			table.Name)).
		AddButtons([]string{"Cancel", "Delete"}).
		SetDoneFunc(func(buttonIndex int, buttonLabel string) {
			if buttonLabel == "Delete" {
				se.DeleteTable(schemaIndex, tableIndex)
				schema := se.db.Schemas[schemaIndex]
				se.pages.RemovePage("table-editor")
				se.pages.RemovePage("schema-editor")
				se.showSchemaEditor(schemaIndex, schema)
			}
			se.pages.RemovePage("confirm-delete-table")
		})

	modal.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.pages.RemovePage("confirm-delete-table")
			return nil
		}
		return event
	})

	se.pages.AddPage("confirm-delete-table", modal, true, true)
}

// showDeleteColumnConfirm shows confirmation dialog for column deletion
func (se *SchemaEditor) showDeleteColumnConfirm(schemaIndex, tableIndex int, columnName string) {
	modal := tview.NewModal().
		SetText(fmt.Sprintf("Delete column '%s'? This action cannot be undone.",
			columnName)).
		AddButtons([]string{"Cancel", "Delete"}).
		SetDoneFunc(func(buttonIndex int, buttonLabel string) {
			if buttonLabel == "Delete" {
				se.DeleteColumn(schemaIndex, tableIndex, columnName)
				se.pages.RemovePage("column-editor")
				se.pages.RemovePage("confirm-delete-column")
				se.pages.SwitchToPage("table-editor")
			} else {
				se.pages.RemovePage("confirm-delete-column")
			}
		})

	modal.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.pages.RemovePage("confirm-delete-column")
			return nil
		}
		return event
	})

	se.pages.AddPage("confirm-delete-column", modal, true, true)
}
35  pkg/ui/domain_dataops.go  Normal file
@@ -0,0 +1,35 @@
package ui

import (
	"git.warky.dev/wdevs/relspecgo/pkg/models"
)

// createDomain creates a new domain
func (se *SchemaEditor) createDomain(name, description string) {
	domain := &models.Domain{
		Name:        name,
		Description: description,
		Tables:      make([]*models.DomainTable, 0),
		Sequence:    uint(len(se.db.Domains)),
	}

	se.db.Domains = append(se.db.Domains, domain)
	se.showDomainList()
}

// updateDomain updates an existing domain
func (se *SchemaEditor) updateDomain(index int, name, description string) {
	if index >= 0 && index < len(se.db.Domains) {
		se.db.Domains[index].Name = name
		se.db.Domains[index].Description = description
		se.showDomainList()
	}
}

// deleteDomain deletes a domain by index
func (se *SchemaEditor) deleteDomain(index int) {
	if index >= 0 && index < len(se.db.Domains) {
		se.db.Domains = append(se.db.Domains[:index], se.db.Domains[index+1:]...)
		se.showDomainList()
	}
}
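`deleteDomain` above uses Go's standard append-based idiom for removing a slice element by index. A minimal, self-contained sketch of the same move (the `deleteAt` helper is illustrative, operating on `[]string` rather than `[]*models.Domain`):

```go
package main

import "fmt"

// deleteAt removes the element at index i using the same append-based
// idiom as deleteDomain, with the same bounds guard: out-of-range
// indexes leave the slice untouched.
func deleteAt(s []string, i int) []string {
	if i < 0 || i >= len(s) {
		return s
	}
	return append(s[:i], s[i+1:]...)
}

func main() {
	domains := []string{"billing", "auth", "reporting"}
	domains = deleteAt(domains, 1)
	fmt.Println(domains) // [billing reporting]
}
```

One caveat of this idiom: it shifts elements within the backing array in place, so any other slice sharing that array sees the mutation; that is fine here because `se.db.Domains` is the only reference.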
258  pkg/ui/domain_screens.go  Normal file
@@ -0,0 +1,258 @@
package ui

import (
	"fmt"
	"strings"

	"github.com/gdamore/tcell/v2"
	"github.com/rivo/tview"

	"git.warky.dev/wdevs/relspecgo/pkg/models"
)

// showDomainList displays the domain management screen
func (se *SchemaEditor) showDomainList() {
	flex := tview.NewFlex().SetDirection(tview.FlexRow)

	// Title
	title := tview.NewTextView().
		SetText("[::b]Manage Domains").
		SetDynamicColors(true).
		SetTextAlign(tview.AlignCenter)

	// Create domains table
	domainTable := tview.NewTable().SetBorders(true).SetSelectable(true, false).SetFixed(1, 0)

	// Add header row
	headers := []string{"Name", "Sequence", "Total Tables", "Description"}
	headerWidths := []int{20, 15, 20}
	for i, header := range headers {
		padding := ""
		if i < len(headerWidths) {
			padding = strings.Repeat(" ", headerWidths[i]-len(header))
		}
		cell := tview.NewTableCell(header + padding).
			SetTextColor(tcell.ColorYellow).
			SetSelectable(false).
			SetAlign(tview.AlignLeft)
		domainTable.SetCell(0, i, cell)
	}

	// Add existing domains
	for row, domain := range se.db.Domains {
		domain := domain // capture for closure

		// Name - pad to 20 chars
		nameStr := fmt.Sprintf("%-20s", domain.Name)
		nameCell := tview.NewTableCell(nameStr).SetSelectable(true)
		domainTable.SetCell(row+1, 0, nameCell)

		// Sequence - pad to 15 chars
		seqStr := fmt.Sprintf("%-15s", fmt.Sprintf("%d", domain.Sequence))
		seqCell := tview.NewTableCell(seqStr).SetSelectable(true)
		domainTable.SetCell(row+1, 1, seqCell)

		// Total Tables - pad to 20 chars
		tablesStr := fmt.Sprintf("%-20s", fmt.Sprintf("%d", len(domain.Tables)))
		tablesCell := tview.NewTableCell(tablesStr).SetSelectable(true)
		domainTable.SetCell(row+1, 2, tablesCell)

		// Description - no padding, takes remaining space
		descCell := tview.NewTableCell(domain.Description).SetSelectable(true)
		domainTable.SetCell(row+1, 3, descCell)
	}

	domainTable.SetTitle(" Domains ").SetBorder(true).SetTitleAlign(tview.AlignLeft)

	// Action buttons flex
	btnFlex := tview.NewFlex()
	btnNewDomain := tview.NewButton("New Domain [n]").SetSelectedFunc(func() {
		se.showNewDomainDialog()
	})
	btnBack := tview.NewButton("Back [b]").SetSelectedFunc(func() {
		se.pages.SwitchToPage("main")
		se.pages.RemovePage("domains")
	})

	// Set up button input captures for Tab/Shift+Tab navigation
	btnNewDomain.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyBacktab {
			se.app.SetFocus(domainTable)
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(btnBack)
			return nil
		}
		return event
	})

	btnBack.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyBacktab {
			se.app.SetFocus(btnNewDomain)
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(domainTable)
			return nil
		}
		return event
	})

	btnFlex.AddItem(btnNewDomain, 0, 1, true).
		AddItem(btnBack, 0, 1, false)

	domainTable.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.pages.SwitchToPage("main")
			se.pages.RemovePage("domains")
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(btnNewDomain)
			return nil
		}
		if event.Key() == tcell.KeyEnter {
			row, _ := domainTable.GetSelection()
			if row > 0 && row <= len(se.db.Domains) { // Skip header row
				domainIndex := row - 1
				se.showDomainEditor(domainIndex, se.db.Domains[domainIndex])
				return nil
			}
		}
		if event.Rune() == 'n' {
			se.showNewDomainDialog()
			return nil
		}
		if event.Rune() == 'b' {
			se.pages.SwitchToPage("main")
			se.pages.RemovePage("domains")
			return nil
		}
		return event
	})

	flex.AddItem(title, 1, 0, false).
		AddItem(domainTable, 0, 1, true).
		AddItem(btnFlex, 1, 0, false)

	se.pages.AddPage("domains", flex, true, true)
}

// showNewDomainDialog displays a dialog to create a new domain
func (se *SchemaEditor) showNewDomainDialog() {
	form := tview.NewForm()

	domainName := ""
	domainDesc := ""

	form.AddInputField("Name", "", 40, nil, func(value string) {
		domainName = value
	})

	form.AddInputField("Description", "", 50, nil, func(value string) {
		domainDesc = value
	})

	form.AddButton("Save", func() {
		if domainName == "" {
			return
		}
		se.createDomain(domainName, domainDesc)
		se.pages.RemovePage("new-domain")
		se.pages.RemovePage("domains")
		se.showDomainList()
	})

	form.AddButton("Back", func() {
		se.pages.RemovePage("new-domain")
		se.pages.RemovePage("domains")
		se.showDomainList()
	})

	form.SetBorder(true).SetTitle(" New Domain ").SetTitleAlign(tview.AlignLeft)
	form.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.showExitConfirmation("new-domain", "domains")
			return nil
		}
		return event
	})

	se.pages.AddPage("new-domain", form, true, true)
}

// showDomainEditor displays a dialog to edit an existing domain
func (se *SchemaEditor) showDomainEditor(index int, domain *models.Domain) {
	form := tview.NewForm()

	domainName := domain.Name
	domainDesc := domain.Description

	form.AddInputField("Name", domainName, 40, nil, func(value string) {
		domainName = value
	})

	form.AddInputField("Description", domainDesc, 50, nil, func(value string) {
		domainDesc = value
	})

	form.AddButton("Save", func() {
		if domainName == "" {
			return
		}
		se.updateDomain(index, domainName, domainDesc)
		se.pages.RemovePage("edit-domain")
		se.pages.RemovePage("domains")
		se.showDomainList()
	})

	form.AddButton("Delete", func() {
		se.showDeleteDomainConfirm(index)
	})

	form.AddButton("Back", func() {
		se.pages.RemovePage("edit-domain")
		se.pages.RemovePage("domains")
		se.showDomainList()
	})

	form.SetBorder(true).SetTitle(" Edit Domain ").SetTitleAlign(tview.AlignLeft)
	form.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.showExitConfirmation("edit-domain", "domains")
			return nil
		}
		return event
	})

	se.pages.AddPage("edit-domain", form, true, true)
}

// showDeleteDomainConfirm shows a confirmation dialog before deleting a domain
func (se *SchemaEditor) showDeleteDomainConfirm(index int) {
	modal := tview.NewModal().
		SetText(fmt.Sprintf("Delete domain '%s'? This action cannot be undone.", se.db.Domains[index].Name)).
		AddButtons([]string{"Cancel", "Delete"}).
		SetDoneFunc(func(buttonIndex int, buttonLabel string) {
|
||||
if buttonLabel == "Delete" {
|
||||
se.deleteDomain(index)
|
||||
se.pages.RemovePage("delete-domain-confirm")
|
||||
se.pages.RemovePage("edit-domain")
|
||||
se.pages.RemovePage("domains")
|
||||
se.showDomainList()
|
||||
} else {
|
||||
se.pages.RemovePage("delete-domain-confirm")
|
||||
}
|
||||
})
|
||||
|
||||
modal.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
|
||||
if event.Key() == tcell.KeyEscape {
|
||||
se.pages.RemovePage("delete-domain-confirm")
|
||||
return nil
|
||||
}
|
||||
return event
|
||||
})
|
||||
|
||||
se.pages.AddAndSwitchToPage("delete-domain-confirm", modal, true)
|
||||
}
|
||||
pkg/ui/editor.go (new file, 73 lines)
@@ -0,0 +1,73 @@
package ui

import (
	"fmt"
	"sort"

	"github.com/rivo/tview"

	"git.warky.dev/wdevs/relspecgo/pkg/models"
)

// SchemaEditor represents the interactive schema editor
type SchemaEditor struct {
	db         *models.Database
	app        *tview.Application
	pages      *tview.Pages
	loadConfig *LoadConfig
	saveConfig *SaveConfig
}

// NewSchemaEditor creates a new schema editor
func NewSchemaEditor(db *models.Database) *SchemaEditor {
	return &SchemaEditor{
		db:         db,
		app:        tview.NewApplication(),
		pages:      tview.NewPages(),
		loadConfig: nil,
		saveConfig: nil,
	}
}

// NewSchemaEditorWithConfigs creates a new schema editor with load/save configurations
func NewSchemaEditorWithConfigs(db *models.Database, loadConfig *LoadConfig, saveConfig *SaveConfig) *SchemaEditor {
	return &SchemaEditor{
		db:         db,
		app:        tview.NewApplication(),
		pages:      tview.NewPages(),
		loadConfig: loadConfig,
		saveConfig: saveConfig,
	}
}

// Run starts the interactive editor
func (se *SchemaEditor) Run() error {
	// If no database is loaded, show the load screen
	if se.db == nil {
		se.showLoadScreen()
	} else {
		// Create the main menu view
		mainMenu := se.createMainMenu()
		se.pages.AddPage("main", mainMenu, true, true)
	}

	// Run the application
	if err := se.app.SetRoot(se.pages, true).Run(); err != nil {
		return fmt.Errorf("application error: %w", err)
	}

	return nil
}

// GetDatabase returns the current database
func (se *SchemaEditor) GetDatabase() *models.Database {
	return se.db
}

// getColumnNames returns the table's column names in sorted order
func getColumnNames(table *models.Table) []string {
	names := make([]string, 0, len(table.Columns))
	for name := range table.Columns {
		names = append(names, name)
	}
	sort.Strings(names) // map iteration order is random; sort for stable output
	return names
}
pkg/ui/load_save_screens.go (new file, 791 lines)
@@ -0,0 +1,791 @@
package ui

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"

	"github.com/gdamore/tcell/v2"
	"github.com/rivo/tview"

	"git.warky.dev/wdevs/relspecgo/pkg/merge"
	"git.warky.dev/wdevs/relspecgo/pkg/models"
	"git.warky.dev/wdevs/relspecgo/pkg/readers"
	rbun "git.warky.dev/wdevs/relspecgo/pkg/readers/bun"
	rdbml "git.warky.dev/wdevs/relspecgo/pkg/readers/dbml"
	rdctx "git.warky.dev/wdevs/relspecgo/pkg/readers/dctx"
	rdrawdb "git.warky.dev/wdevs/relspecgo/pkg/readers/drawdb"
	rdrizzle "git.warky.dev/wdevs/relspecgo/pkg/readers/drizzle"
	rgorm "git.warky.dev/wdevs/relspecgo/pkg/readers/gorm"
	rgraphql "git.warky.dev/wdevs/relspecgo/pkg/readers/graphql"
	rjson "git.warky.dev/wdevs/relspecgo/pkg/readers/json"
	rpgsql "git.warky.dev/wdevs/relspecgo/pkg/readers/pgsql"
	rprisma "git.warky.dev/wdevs/relspecgo/pkg/readers/prisma"
	rtypeorm "git.warky.dev/wdevs/relspecgo/pkg/readers/typeorm"
	ryaml "git.warky.dev/wdevs/relspecgo/pkg/readers/yaml"
	"git.warky.dev/wdevs/relspecgo/pkg/writers"
	wbun "git.warky.dev/wdevs/relspecgo/pkg/writers/bun"
	wdbml "git.warky.dev/wdevs/relspecgo/pkg/writers/dbml"
	wdctx "git.warky.dev/wdevs/relspecgo/pkg/writers/dctx"
	wdrawdb "git.warky.dev/wdevs/relspecgo/pkg/writers/drawdb"
	wdrizzle "git.warky.dev/wdevs/relspecgo/pkg/writers/drizzle"
	wgorm "git.warky.dev/wdevs/relspecgo/pkg/writers/gorm"
	wgraphql "git.warky.dev/wdevs/relspecgo/pkg/writers/graphql"
	wjson "git.warky.dev/wdevs/relspecgo/pkg/writers/json"
	wpgsql "git.warky.dev/wdevs/relspecgo/pkg/writers/pgsql"
	wprisma "git.warky.dev/wdevs/relspecgo/pkg/writers/prisma"
	wtypeorm "git.warky.dev/wdevs/relspecgo/pkg/writers/typeorm"
	wyaml "git.warky.dev/wdevs/relspecgo/pkg/writers/yaml"
)

// LoadConfig holds the configuration for loading a database
type LoadConfig struct {
	SourceType string
	FilePath   string
	ConnString string
}

// SaveConfig holds the configuration for saving a database
type SaveConfig struct {
	TargetType string
	FilePath   string
	ConnString string
}

// showLoadScreen displays the database load screen
func (se *SchemaEditor) showLoadScreen() {
	flex := tview.NewFlex().SetDirection(tview.FlexRow)

	// Title
	title := tview.NewTextView().
		SetText("[::b]Load Database Schema").
		SetTextAlign(tview.AlignCenter).
		SetDynamicColors(true)

	// Form
	form := tview.NewForm()
	form.SetBorder(true).SetTitle(" Load Configuration ").SetTitleAlign(tview.AlignLeft)

	// Format selection
	formatOptions := []string{
		"dbml", "dctx", "drawdb", "graphql", "json", "yaml",
		"gorm", "bun", "drizzle", "prisma", "typeorm", "pgsql",
	}
	selectedFormat := 0
	currentFormat := formatOptions[selectedFormat]

	// File path input
	filePath := ""
	connString := ""

	form.AddDropDown("Format", formatOptions, 0, func(option string, index int) {
		selectedFormat = index
		currentFormat = option
	})

	form.AddInputField("File Path", "", 50, nil, func(value string) {
		filePath = value
	})

	form.AddInputField("Connection String", "", 50, nil, func(value string) {
		connString = value
	})

	form.AddTextView("Help", getLoadHelpText(), 0, 5, true, false)

	// Buttons
	form.AddButton("Load [l]", func() {
		se.loadDatabase(currentFormat, filePath, connString)
	})

	form.AddButton("Create New [n]", func() {
		se.createNewDatabase()
	})

	form.AddButton("Exit [q]", func() {
		se.app.Stop()
	})
	// Keyboard shortcuts. Note: tview's SetInputCapture replaces any previous
	// capture, so a single handler must cover Escape and the rune shortcuts;
	// registering a second capture would silently disable the first.
	form.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.app.Stop()
			return nil
		}
		switch event.Rune() {
		case 'l':
			se.loadDatabase(currentFormat, filePath, connString)
			return nil
		case 'n':
			se.createNewDatabase()
			return nil
		case 'q':
			se.app.Stop()
			return nil
		}
		return event
	})

	flex.AddItem(title, 1, 0, false).
		AddItem(form, 0, 1, true)

	se.pages.AddAndSwitchToPage("load-database", flex, true)
}
// showSaveScreen displays the save database screen
func (se *SchemaEditor) showSaveScreen() {
	flex := tview.NewFlex().SetDirection(tview.FlexRow)

	// Title
	title := tview.NewTextView().
		SetText("[::b]Save Database Schema").
		SetTextAlign(tview.AlignCenter).
		SetDynamicColors(true)

	// Form
	form := tview.NewForm()
	form.SetBorder(true).SetTitle(" Save Configuration ").SetTitleAlign(tview.AlignLeft)

	// Format selection
	formatOptions := []string{
		"dbml", "dctx", "drawdb", "graphql", "json", "yaml",
		"gorm", "bun", "drizzle", "prisma", "typeorm", "pgsql",
	}
	selectedFormat := 0
	currentFormat := formatOptions[selectedFormat]

	// File path input
	filePath := ""
	if se.saveConfig != nil {
		// Pre-populate with the existing save config
		for i, format := range formatOptions {
			if format == se.saveConfig.TargetType {
				selectedFormat = i
				currentFormat = format
				break
			}
		}
		filePath = se.saveConfig.FilePath
	}

	form.AddDropDown("Format", formatOptions, selectedFormat, func(option string, index int) {
		selectedFormat = index
		currentFormat = option
	})

	form.AddInputField("File Path", filePath, 50, nil, func(value string) {
		filePath = value
	})

	form.AddTextView("Help", getSaveHelpText(), 0, 5, true, false)

	// Buttons
	form.AddButton("Save [s]", func() {
		se.saveDatabase(currentFormat, filePath)
	})

	form.AddButton("Update Existing Database [u]", func() {
		// A source is required: saveConfig is preferred, loadConfig is the fallback
		if se.saveConfig != nil || se.loadConfig != nil {
			se.showUpdateExistingDatabaseConfirm()
		} else {
			se.showErrorDialog("Error", "No database source found. Use Save instead.")
		}
	})

	form.AddButton("Back [b]", func() {
		se.pages.RemovePage("save-database")
		se.pages.SwitchToPage("main")
	})

	// Keyboard shortcuts
	form.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.pages.RemovePage("save-database")
			se.pages.SwitchToPage("main")
			return nil
		}
		switch event.Rune() {
		case 's':
			se.saveDatabase(currentFormat, filePath)
			return nil
		case 'u':
			// A source is required: saveConfig is preferred, loadConfig is the fallback
			if se.saveConfig != nil || se.loadConfig != nil {
				se.showUpdateExistingDatabaseConfirm()
			} else {
				se.showErrorDialog("Error", "No database source found. Use Save instead.")
			}
			return nil
		case 'b':
			se.pages.RemovePage("save-database")
			se.pages.SwitchToPage("main")
			return nil
		}
		return event
	})

	flex.AddItem(title, 1, 0, false).
		AddItem(form, 0, 1, true)

	se.pages.AddAndSwitchToPage("save-database", flex, true)
}
// loadDatabase loads a database from the specified configuration
func (se *SchemaEditor) loadDatabase(format, filePath, connString string) {
	// Validate input
	if format == "pgsql" {
		if connString == "" {
			se.showErrorDialog("Error", "Connection string is required for PostgreSQL")
			return
		}
	} else {
		if filePath == "" {
			se.showErrorDialog("Error", "File path is required for "+format)
			return
		}
		// Expand home directory
		if len(filePath) > 0 && filePath[0] == '~' {
			home, err := os.UserHomeDir()
			if err == nil {
				filePath = filepath.Join(home, filePath[1:])
			}
		}
	}

	// Create reader
	var reader readers.Reader
	switch format {
	case "dbml":
		reader = rdbml.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "dctx":
		reader = rdctx.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "drawdb":
		reader = rdrawdb.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "graphql":
		reader = rgraphql.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "json":
		reader = rjson.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "yaml":
		reader = ryaml.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "gorm":
		reader = rgorm.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "bun":
		reader = rbun.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "drizzle":
		reader = rdrizzle.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "prisma":
		reader = rprisma.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "typeorm":
		reader = rtypeorm.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "pgsql":
		reader = rpgsql.NewReader(&readers.ReaderOptions{ConnectionString: connString})
	default:
		se.showErrorDialog("Error", "Unsupported format: "+format)
		return
	}

	// Read database
	db, err := reader.ReadDatabase()
	if err != nil {
		se.showErrorDialog("Load Error", fmt.Sprintf("Failed to load database: %v", err))
		return
	}

	// Store load config
	se.loadConfig = &LoadConfig{
		SourceType: format,
		FilePath:   filePath,
		ConnString: connString,
	}

	// Update database
	se.db = db

	// Show success and switch to main menu
	se.showSuccessDialog("Load Complete", fmt.Sprintf("Successfully loaded database '%s'", db.Name), func() {
		se.pages.RemovePage("load-database")
		se.pages.RemovePage("main")
		se.pages.AddPage("main", se.createMainMenu(), true, true)
	})
}
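// expandHome is an illustrative helper (an assumption, not part of the
// original file and not referenced by it): it factors out the '~'
// home-directory expansion that loadDatabase above and saveDatabase below
// each inline, and falls back to the original path if the home directory
// cannot be resolved.
func expandHome(p string) string {
	if len(p) > 0 && p[0] == '~' {
		if home, err := os.UserHomeDir(); err == nil {
			return filepath.Join(home, p[1:])
		}
	}
	return p
}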
// saveDatabase saves the database to the specified configuration
func (se *SchemaEditor) saveDatabase(format, filePath string) {
	// Validate input
	if format == "pgsql" {
		se.showErrorDialog("Error", "Direct PostgreSQL save is not supported from the UI. Use --to pgsql --to-path output.sql")
		return
	}

	if filePath == "" {
		se.showErrorDialog("Error", "File path is required")
		return
	}

	// Expand home directory
	if len(filePath) > 0 && filePath[0] == '~' {
		home, err := os.UserHomeDir()
		if err == nil {
			filePath = filepath.Join(home, filePath[1:])
		}
	}

	// Create writer
	var writer writers.Writer
	switch format {
	case "dbml":
		writer = wdbml.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "dctx":
		writer = wdctx.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "drawdb":
		writer = wdrawdb.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "graphql":
		writer = wgraphql.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "json":
		writer = wjson.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "yaml":
		writer = wyaml.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "gorm":
		writer = wgorm.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "bun":
		writer = wbun.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "drizzle":
		writer = wdrizzle.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "prisma":
		writer = wprisma.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "typeorm":
		writer = wtypeorm.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	case "pgsql":
		writer = wpgsql.NewWriter(&writers.WriterOptions{OutputPath: filePath})
	default:
		se.showErrorDialog("Error", "Unsupported format: "+format)
		return
	}

	// Write database
	err := writer.WriteDatabase(se.db)
	if err != nil {
		se.showErrorDialog("Save Error", fmt.Sprintf("Failed to save database: %v", err))
		return
	}

	// Store save config
	se.saveConfig = &SaveConfig{
		TargetType: format,
		FilePath:   filePath,
	}

	// Show success
	se.showSuccessDialog("Save Complete", fmt.Sprintf("Successfully saved database to %s", filePath), func() {
		se.pages.RemovePage("save-database")
		se.pages.SwitchToPage("main")
	})
}
// createNewDatabase creates a new empty database
func (se *SchemaEditor) createNewDatabase() {
	// Create a new empty database
	se.db = &models.Database{
		Name:    "New Database",
		Schemas: []*models.Schema{},
	}

	// Clear load config
	se.loadConfig = nil

	// Show success and switch to main menu
	se.showSuccessDialog("New Database", "Created new empty database", func() {
		se.pages.RemovePage("load-database")
		se.pages.AddPage("main", se.createMainMenu(), true, true)
	})
}
// showErrorDialog displays an error dialog
func (se *SchemaEditor) showErrorDialog(_title, message string) {
	modal := tview.NewModal().
		SetText(message).
		AddButtons([]string{"OK"}).
		SetDoneFunc(func(buttonIndex int, buttonLabel string) {
			se.pages.RemovePage("error-dialog")
		})

	modal.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.pages.RemovePage("error-dialog")
			return nil
		}
		return event
	})

	se.pages.AddPage("error-dialog", modal, true, true)
}

// showSuccessDialog displays a success dialog
func (se *SchemaEditor) showSuccessDialog(_title, message string, onClose func()) {
	modal := tview.NewModal().
		SetText(message).
		AddButtons([]string{"OK"}).
		SetDoneFunc(func(buttonIndex int, buttonLabel string) {
			se.pages.RemovePage("success-dialog")
			if onClose != nil {
				onClose()
			}
		})

	modal.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.pages.RemovePage("success-dialog")
			if onClose != nil {
				onClose()
			}
			return nil
		}
		return event
	})

	se.pages.AddPage("success-dialog", modal, true, true)
}
// getLoadHelpText returns the help text for the load screen
func getLoadHelpText() string {
	return `File-based formats: dbml, dctx, drawdb, graphql, json, yaml, gorm, bun, drizzle, prisma, typeorm
Database formats: pgsql (requires connection string)

Examples:
- File path: ~/schemas/mydb.dbml or /path/to/schema.json
- Connection: postgres://user:pass@localhost/dbname`
}
// showUpdateExistingDatabaseConfirm displays a confirmation dialog before updating the existing database
func (se *SchemaEditor) showUpdateExistingDatabaseConfirm() {
	// Use saveConfig if available, otherwise fall back to loadConfig
	var targetType, targetPath string
	if se.saveConfig != nil {
		targetType = se.saveConfig.TargetType
		targetPath = se.saveConfig.FilePath
	} else if se.loadConfig != nil {
		targetType = se.loadConfig.SourceType
		targetPath = se.loadConfig.FilePath
	} else {
		return
	}

	confirmText := fmt.Sprintf("Update existing database?\n\nFormat: %s\nPath: %s\n\nThis will overwrite the source.",
		targetType, targetPath)

	modal := tview.NewModal().
		SetText(confirmText).
		AddButtons([]string{"Cancel", "Update"}).
		SetDoneFunc(func(buttonIndex int, buttonLabel string) {
			if buttonLabel == "Update" {
				se.pages.RemovePage("update-confirm")
				se.pages.RemovePage("save-database")
				se.saveDatabase(targetType, targetPath)
				se.pages.SwitchToPage("main")
			} else {
				se.pages.RemovePage("update-confirm")
			}
		})

	modal.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.pages.RemovePage("update-confirm")
			return nil
		}
		return event
	})

	se.pages.AddAndSwitchToPage("update-confirm", modal, true)
}
// getSaveHelpText returns the help text for the save screen
func getSaveHelpText() string {
	return `File-based formats: dbml, dctx, drawdb, graphql, json, yaml, gorm, bun, drizzle, prisma, typeorm, pgsql (SQL export)

Examples:
- File: ~/schemas/mydb.dbml
- Directory (for code formats): ./models/`
}
// showImportScreen displays the import/merge database screen
func (se *SchemaEditor) showImportScreen() {
	flex := tview.NewFlex().SetDirection(tview.FlexRow)

	// Title
	title := tview.NewTextView().
		SetText("[::b]Import & Merge Database Schema").
		SetTextAlign(tview.AlignCenter).
		SetDynamicColors(true)

	// Form
	form := tview.NewForm()
	form.SetBorder(true).SetTitle(" Import Configuration ").SetTitleAlign(tview.AlignLeft)

	// Format selection
	formatOptions := []string{
		"dbml", "dctx", "drawdb", "graphql", "json", "yaml",
		"gorm", "bun", "drizzle", "prisma", "typeorm", "pgsql",
	}
	selectedFormat := 0
	currentFormat := formatOptions[selectedFormat]

	// File path input and merge options
	filePath := ""
	connString := ""
	skipDomains := false
	skipRelations := false
	skipEnums := false
	skipViews := false
	skipSequences := false
	skipTables := ""

	form.AddDropDown("Format", formatOptions, 0, func(option string, index int) {
		selectedFormat = index
		currentFormat = option
	})

	form.AddInputField("File Path", "", 50, nil, func(value string) {
		filePath = value
	})

	form.AddInputField("Connection String", "", 50, nil, func(value string) {
		connString = value
	})

	form.AddInputField("Skip Tables (comma-separated)", "", 50, nil, func(value string) {
		skipTables = value
	})

	form.AddCheckbox("Skip Domains", false, func(checked bool) {
		skipDomains = checked
	})

	form.AddCheckbox("Skip Relations", false, func(checked bool) {
		skipRelations = checked
	})

	form.AddCheckbox("Skip Enums", false, func(checked bool) {
		skipEnums = checked
	})

	form.AddCheckbox("Skip Views", false, func(checked bool) {
		skipViews = checked
	})

	form.AddCheckbox("Skip Sequences", false, func(checked bool) {
		skipSequences = checked
	})

	form.AddTextView("Help", getImportHelpText(), 0, 7, true, false)

	// Buttons
	form.AddButton("Import & Merge [i]", func() {
		se.importAndMergeDatabase(currentFormat, filePath, connString, skipDomains, skipRelations, skipEnums, skipViews, skipSequences, skipTables)
	})

	form.AddButton("Back [b]", func() {
		se.pages.RemovePage("import-database")
		se.pages.SwitchToPage("main")
	})

	form.AddButton("Exit [q]", func() {
		se.app.Stop()
	})

	// Keyboard shortcuts
	form.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.pages.RemovePage("import-database")
			se.pages.SwitchToPage("main")
			return nil
		}
		switch event.Rune() {
		case 'i':
			se.importAndMergeDatabase(currentFormat, filePath, connString, skipDomains, skipRelations, skipEnums, skipViews, skipSequences, skipTables)
			return nil
		case 'b':
			se.pages.RemovePage("import-database")
			se.pages.SwitchToPage("main")
			return nil
		case 'q':
			se.app.Stop()
			return nil
		}
		return event
	})

	flex.AddItem(title, 1, 0, false).
		AddItem(form, 0, 1, true)

	se.pages.AddAndSwitchToPage("import-database", flex, true)
}
// importAndMergeDatabase imports and merges a database from the specified configuration
func (se *SchemaEditor) importAndMergeDatabase(format, filePath, connString string, skipDomains, skipRelations, skipEnums, skipViews, skipSequences bool, skipTables string) {
	// Validate input
	if format == "pgsql" {
		if connString == "" {
			se.showErrorDialog("Error", "Connection string is required for PostgreSQL")
			return
		}
	} else {
		if filePath == "" {
			se.showErrorDialog("Error", "File path is required for "+format)
			return
		}
		// Expand home directory
		if len(filePath) > 0 && filePath[0] == '~' {
			home, err := os.UserHomeDir()
			if err == nil {
				filePath = filepath.Join(home, filePath[1:])
			}
		}
	}

	// Create reader
	var reader readers.Reader
	switch format {
	case "dbml":
		reader = rdbml.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "dctx":
		reader = rdctx.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "drawdb":
		reader = rdrawdb.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "graphql":
		reader = rgraphql.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "json":
		reader = rjson.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "yaml":
		reader = ryaml.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "gorm":
		reader = rgorm.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "bun":
		reader = rbun.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "drizzle":
		reader = rdrizzle.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "prisma":
		reader = rprisma.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "typeorm":
		reader = rtypeorm.NewReader(&readers.ReaderOptions{FilePath: filePath})
	case "pgsql":
		reader = rpgsql.NewReader(&readers.ReaderOptions{ConnectionString: connString})
	default:
		se.showErrorDialog("Error", "Unsupported format: "+format)
		return
	}

	// Read the database to import
	importDb, err := reader.ReadDatabase()
	if err != nil {
		se.showErrorDialog("Import Error", fmt.Sprintf("Failed to read database: %v", err))
		return
	}

	// Show confirmation dialog
	se.showImportConfirmation(importDb, skipDomains, skipRelations, skipEnums, skipViews, skipSequences, skipTables)
}
// showImportConfirmation shows a confirmation dialog before merging
func (se *SchemaEditor) showImportConfirmation(importDb *models.Database, skipDomains, skipRelations, skipEnums, skipViews, skipSequences bool, skipTables string) {
	confirmText := fmt.Sprintf("Import & Merge Database?\n\nSource: %s\nTarget: %s\n\nThis will add missing schemas, tables, columns, and other objects from the source to your database.\n\nExisting items will NOT be modified.",
		importDb.Name, se.db.Name)

	modal := tview.NewModal().
		SetText(confirmText).
		AddButtons([]string{"Cancel", "Merge"}).
		SetDoneFunc(func(buttonIndex int, buttonLabel string) {
			se.pages.RemovePage("import-confirm")
			if buttonLabel == "Merge" {
				se.performMerge(importDb, skipDomains, skipRelations, skipEnums, skipViews, skipSequences, skipTables)
			}
		})

	modal.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.pages.RemovePage("import-confirm")
			se.pages.SwitchToPage("import-database")
			return nil
		}
		return event
	})

	se.pages.AddAndSwitchToPage("import-confirm", modal, true)
}
// performMerge performs the actual merge operation
func (se *SchemaEditor) performMerge(importDb *models.Database, skipDomains, skipRelations, skipEnums, skipViews, skipSequences bool, skipTables string) {
	// Create merge options
	opts := &merge.MergeOptions{
		SkipDomains:   skipDomains,
		SkipRelations: skipRelations,
		SkipEnums:     skipEnums,
		SkipViews:     skipViews,
		SkipSequences: skipSequences,
	}

	// Parse skip tables
	if skipTables != "" {
		opts.SkipTableNames = parseSkipTablesUI(skipTables)
	}

	// Perform the merge
	result := merge.MergeDatabases(se.db, importDb, opts)

	// Update the database timestamp
	se.db.UpdateDate()

	// Show success dialog with the merge summary
	summary := merge.GetMergeSummary(result)
	se.showSuccessDialog("Import Complete", summary, func() {
		se.pages.RemovePage("import-database")
		se.pages.RemovePage("main")
		se.pages.AddPage("main", se.createMainMenu(), true, true)
	})
}
// getImportHelpText returns the help text for the import screen
func getImportHelpText() string {
	return `Import & Merge: Adds missing schemas, tables, columns, and other objects to your existing database.

File-based formats: dbml, dctx, drawdb, graphql, json, yaml, gorm, bun, drizzle, prisma, typeorm
Database formats: pgsql (requires connection string)

Skip options: Check to exclude specific object types from the merge.`
}
func parseSkipTablesUI(skipTablesStr string) map[string]bool {
|
||||
skipTables := make(map[string]bool)
|
||||
if skipTablesStr == "" {
|
||||
return skipTables
|
||||
}
|
||||
|
||||
// Split by comma and trim whitespace
|
||||
parts := strings.Split(skipTablesStr, ",")
|
||||
for _, part := range parts {
|
||||
trimmed := strings.TrimSpace(part)
|
||||
if trimmed != "" {
|
||||
// Store in lowercase for case-insensitive matching
|
||||
skipTables[strings.ToLower(trimmed)] = true
|
||||
}
|
||||
}
|
||||
|
||||
return skipTables
|
||||
}
|
||||
65	pkg/ui/main_menu.go	Normal file
@@ -0,0 +1,65 @@
package ui

import (
	"fmt"

	"github.com/gdamore/tcell/v2"
	"github.com/rivo/tview"
)

// createMainMenu creates the main menu screen
func (se *SchemaEditor) createMainMenu() tview.Primitive {
	flex := tview.NewFlex().SetDirection(tview.FlexRow)

	// Title with database name
	dbName := se.db.Name
	if dbName == "" {
		dbName = "Untitled"
	}
	updateAtStr := ""
	if se.db.UpdatedAt != "" {
		updateAtStr = fmt.Sprintf("Updated @ %s", se.db.UpdatedAt)
	}
	titleText := fmt.Sprintf("[::b]RelSpec Schema Editor\n[::d]Database: %s %s\n[::d]Press arrow keys to navigate, Enter to select", dbName, updateAtStr)
	title := tview.NewTextView().
		SetText(titleText).
		SetDynamicColors(true)

	// Menu options
	menu := tview.NewList().
		AddItem("Edit Database", "Edit database name, description, and properties", 'e', func() {
			se.showEditDatabaseForm()
		}).
		AddItem("Manage Schemas", "View, create, edit, and delete schemas", 's', func() {
			se.showSchemaList()
		}).
		AddItem("Manage Tables", "View and manage tables in schemas", 't', func() {
			se.showTableList()
		}).
		AddItem("Manage Domains", "View, create, edit, and delete domains", 'd', func() {
			se.showDomainList()
		}).
		AddItem("Import & Merge", "Import and merge schema from another database", 'i', func() {
			se.showImportScreen()
		}).
		AddItem("Save Database", "Save database to file or database", 'w', func() {
			se.showSaveScreen()
		}).
		AddItem("Exit Editor", "Exit the editor", 'q', func() {
			se.app.Stop()
		})

	menu.SetBorder(true).SetTitle(" Menu ").SetTitleAlign(tview.AlignLeft)
	menu.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.showExitEditorConfirm()
			return nil
		}
		return event
	})

	flex.AddItem(title, 5, 0, false).
		AddItem(menu, 0, 1, true)

	return flex
}
55	pkg/ui/schema_dataops.go	Normal file
@@ -0,0 +1,55 @@
package ui

import "git.warky.dev/wdevs/relspecgo/pkg/models"

// Schema data operations - business logic for schema management

// CreateSchema creates a new schema and adds it to the database
func (se *SchemaEditor) CreateSchema(name, description string) *models.Schema {
	newSchema := &models.Schema{
		Name:        name,
		Description: description,
		Tables:      make([]*models.Table, 0),
		Sequences:   make([]*models.Sequence, 0),
		Enums:       make([]*models.Enum, 0),
	}
	se.db.UpdateDate()
	se.db.Schemas = append(se.db.Schemas, newSchema)
	return newSchema
}

// UpdateSchema updates an existing schema's properties
func (se *SchemaEditor) UpdateSchema(schemaIndex int, name, owner, description string) {
	if schemaIndex < 0 || schemaIndex >= len(se.db.Schemas) {
		return
	}
	se.db.UpdateDate()
	schema := se.db.Schemas[schemaIndex]
	schema.Name = name
	schema.Owner = owner
	schema.Description = description
	schema.UpdateDate()
}

// DeleteSchema removes a schema from the database
func (se *SchemaEditor) DeleteSchema(schemaIndex int) bool {
	if schemaIndex < 0 || schemaIndex >= len(se.db.Schemas) {
		return false
	}
	se.db.UpdateDate()
	se.db.Schemas = append(se.db.Schemas[:schemaIndex], se.db.Schemas[schemaIndex+1:]...)
	return true
}

// GetSchema returns a schema by index
func (se *SchemaEditor) GetSchema(schemaIndex int) *models.Schema {
	if schemaIndex < 0 || schemaIndex >= len(se.db.Schemas) {
		return nil
	}
	return se.db.Schemas[schemaIndex]
}

// GetAllSchemas returns all schemas
func (se *SchemaEditor) GetAllSchemas() []*models.Schema {
	return se.db.Schemas
}
362	pkg/ui/schema_screens.go	Normal file
@@ -0,0 +1,362 @@
package ui

import (
	"fmt"
	"strings"

	"github.com/gdamore/tcell/v2"
	"github.com/rivo/tview"

	"git.warky.dev/wdevs/relspecgo/pkg/models"
)

// showSchemaList displays the schema management screen
func (se *SchemaEditor) showSchemaList() {
	flex := tview.NewFlex().SetDirection(tview.FlexRow)

	// Title
	title := tview.NewTextView().
		SetText("[::b]Manage Schemas").
		SetDynamicColors(true).
		SetTextAlign(tview.AlignCenter)

	// Create schemas table
	schemaTable := tview.NewTable().SetBorders(true).SetSelectable(true, false).SetFixed(1, 0)

	// Add header row with padding for full width
	headers := []string{"Name", "Sequence", "Total Tables", "Total Sequences", "Total Views", "GUID", "Description"}
	headerWidths := []int{20, 15, 20, 20, 15, 36} // Last column takes remaining space
	for i, header := range headers {
		padding := ""
		if i < len(headerWidths) {
			padding = strings.Repeat(" ", headerWidths[i]-len(header))
		}
		cell := tview.NewTableCell(header + padding).
			SetTextColor(tcell.ColorYellow).
			SetSelectable(false).
			SetAlign(tview.AlignLeft)
		schemaTable.SetCell(0, i, cell)
	}

	// Add existing schemas
	for row, schema := range se.db.Schemas {
		schema := schema // capture for closure

		// Name - pad to 20 chars
		nameStr := fmt.Sprintf("%-20s", schema.Name)
		nameCell := tview.NewTableCell(nameStr).SetSelectable(true)
		schemaTable.SetCell(row+1, 0, nameCell)

		// Sequence - pad to 15 chars
		seqStr := fmt.Sprintf("%-15s", fmt.Sprintf("%d", schema.Sequence))
		seqCell := tview.NewTableCell(seqStr).SetSelectable(true)
		schemaTable.SetCell(row+1, 1, seqCell)

		// Total Tables - pad to 20 chars
		tablesStr := fmt.Sprintf("%-20s", fmt.Sprintf("%d", len(schema.Tables)))
		tablesCell := tview.NewTableCell(tablesStr).SetSelectable(true)
		schemaTable.SetCell(row+1, 2, tablesCell)

		// Total Sequences - pad to 20 chars
		sequencesStr := fmt.Sprintf("%-20s", fmt.Sprintf("%d", len(schema.Sequences)))
		sequencesCell := tview.NewTableCell(sequencesStr).SetSelectable(true)
		schemaTable.SetCell(row+1, 3, sequencesCell)

		// Total Views - pad to 15 chars
		viewsStr := fmt.Sprintf("%-15s", fmt.Sprintf("%d", len(schema.Views)))
		viewsCell := tview.NewTableCell(viewsStr).SetSelectable(true)
		schemaTable.SetCell(row+1, 4, viewsCell)

		// GUID - pad to 36 chars
		guidStr := fmt.Sprintf("%-36s", schema.GUID)
		guidCell := tview.NewTableCell(guidStr).SetSelectable(true)
		schemaTable.SetCell(row+1, 5, guidCell)

		// Description - no padding, takes remaining space
		descCell := tview.NewTableCell(schema.Description).SetSelectable(true)
		schemaTable.SetCell(row+1, 6, descCell)
	}

	schemaTable.SetTitle(" Schemas ").SetBorder(true).SetTitleAlign(tview.AlignLeft)

	// Action buttons flex (define before input capture)
	btnFlex := tview.NewFlex()
	btnNewSchema := tview.NewButton("New Schema [n]").SetSelectedFunc(func() {
		se.showNewSchemaDialog()
	})
	btnBack := tview.NewButton("Back [b]").SetSelectedFunc(func() {
		se.pages.SwitchToPage("main")
		se.pages.RemovePage("schemas")
	})

	// Set up button input captures for Tab/Shift+Tab navigation
	btnNewSchema.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyBacktab {
			se.app.SetFocus(schemaTable)
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(btnBack)
			return nil
		}
		return event
	})

	btnBack.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyBacktab {
			se.app.SetFocus(btnNewSchema)
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(schemaTable)
			return nil
		}
		return event
	})

	btnFlex.AddItem(btnNewSchema, 0, 1, true).
		AddItem(btnBack, 0, 1, false)

	schemaTable.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.pages.SwitchToPage("main")
			se.pages.RemovePage("schemas")
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(btnNewSchema)
			return nil
		}
		if event.Key() == tcell.KeyEnter {
			row, _ := schemaTable.GetSelection()
			if row > 0 && row <= len(se.db.Schemas) { // Skip header row
				schemaIndex := row - 1
				se.showSchemaEditor(schemaIndex, se.db.Schemas[schemaIndex])
				return nil
			}
		}
		if event.Rune() == 'n' {
			se.showNewSchemaDialog()
			return nil
		}
		if event.Rune() == 'b' {
			se.pages.SwitchToPage("main")
			se.pages.RemovePage("schemas")
			return nil
		}
		return event
	})

	flex.AddItem(title, 1, 0, false).
		AddItem(schemaTable, 0, 1, true).
		AddItem(btnFlex, 1, 0, false)

	se.pages.AddPage("schemas", flex, true, true)
}

// showSchemaEditor shows the editor for a specific schema
func (se *SchemaEditor) showSchemaEditor(index int, schema *models.Schema) {
	flex := tview.NewFlex().SetDirection(tview.FlexRow)

	// Title
	title := tview.NewTextView().
		SetText(fmt.Sprintf("[::b]Schema: %s", schema.Name)).
		SetDynamicColors(true).
		SetTextAlign(tview.AlignCenter)

	// Schema info display
	info := tview.NewTextView().SetDynamicColors(true)
	info.SetText(fmt.Sprintf("Tables: %d | Description: %s",
		len(schema.Tables), schema.Description))

	// Table list
	tableList := tview.NewList().ShowSecondaryText(true)

	for i, table := range schema.Tables {
		tableIndex := i
		table := table
		colCount := len(table.Columns)
		tableList.AddItem(table.Name, fmt.Sprintf("%d columns", colCount), rune('0'+i), func() {
			se.showTableEditor(index, tableIndex, table)
		})
	}

	tableList.AddItem("[New Table]", "Add a new table to this schema", 'n', func() {
		se.showNewTableDialog(index)
	})

	tableList.AddItem("[Edit Schema Info]", "Edit schema properties", 'e', func() {
		se.showEditSchemaDialog(index)
	})

	tableList.AddItem("[Delete Schema]", "Delete this schema", 'd', func() {
		se.showDeleteSchemaConfirm(index)
	})

	tableList.SetBorder(true).SetTitle(" Tables ").SetTitleAlign(tview.AlignLeft)

	// Action buttons (define before input capture)
	btnFlex := tview.NewFlex()
	btnNewTable := tview.NewButton("New Table [n]").SetSelectedFunc(func() {
		se.showNewTableDialog(index)
	})
	btnBack := tview.NewButton("Back to Schemas [b]").SetSelectedFunc(func() {
		se.pages.RemovePage("schema-editor")
		se.pages.SwitchToPage("schemas")
	})

	// Set up button input captures for Tab/Shift+Tab navigation
	btnNewTable.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyBacktab {
			se.app.SetFocus(tableList)
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(btnBack)
			return nil
		}
		return event
	})

	btnBack.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyBacktab {
			se.app.SetFocus(btnNewTable)
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(tableList)
			return nil
		}
		return event
	})

	tableList.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.pages.RemovePage("schema-editor")
			se.pages.SwitchToPage("schemas")
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(btnNewTable)
			return nil
		}
		if event.Rune() == 'b' {
			se.pages.RemovePage("schema-editor")
			se.pages.SwitchToPage("schemas")
		}
		return event
	})

	btnFlex.AddItem(btnNewTable, 0, 1, true).
		AddItem(btnBack, 0, 1, false)

	flex.AddItem(title, 1, 0, false).
		AddItem(info, 2, 0, false).
		AddItem(tableList, 0, 1, true).
		AddItem(btnFlex, 1, 0, false)

	se.pages.AddPage("schema-editor", flex, true, true)
}

// showNewSchemaDialog shows dialog to create a new schema
func (se *SchemaEditor) showNewSchemaDialog() {
	form := tview.NewForm()

	schemaName := ""
	description := ""

	form.AddInputField("Schema Name", "", 40, nil, func(value string) {
		schemaName = value
	})

	form.AddInputField("Description", "", 40, nil, func(value string) {
		description = value
	})

	form.AddButton("Save", func() {
		if schemaName == "" {
			return
		}

		se.CreateSchema(schemaName, description)

		se.pages.RemovePage("new-schema")
		se.pages.RemovePage("schemas")
		se.showSchemaList()
	})

	form.AddButton("Back", func() {
		se.pages.RemovePage("new-schema")
		se.pages.RemovePage("schemas")
		se.showSchemaList()
	})

	form.SetBorder(true).SetTitle(" New Schema ").SetTitleAlign(tview.AlignLeft)
	form.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.showExitConfirmation("new-schema", "schemas")
			return nil
		}
		return event
	})

	se.pages.AddPage("new-schema", form, true, true)
}

// showEditSchemaDialog shows dialog to edit schema properties
func (se *SchemaEditor) showEditSchemaDialog(schemaIndex int) {
	schema := se.db.Schemas[schemaIndex]
	form := tview.NewForm()

	// Local variables to collect changes
	newName := schema.Name
	newOwner := schema.Owner
	newDescription := schema.Description
	newGUID := schema.GUID

	form.AddInputField("Schema Name", schema.Name, 40, nil, func(value string) {
		newName = value
	})

	form.AddInputField("Owner", schema.Owner, 40, nil, func(value string) {
		newOwner = value
	})

	form.AddTextArea("Description", schema.Description, 40, 5, 0, func(value string) {
		newDescription = value
	})

	form.AddInputField("GUID", schema.GUID, 40, nil, func(value string) {
		newGUID = value
	})

	form.AddButton("Save", func() {
		// Apply changes using dataops
		se.UpdateSchema(schemaIndex, newName, newOwner, newDescription)
		se.db.Schemas[schemaIndex].GUID = newGUID

		schema := se.db.Schemas[schemaIndex]
		se.pages.RemovePage("edit-schema")
		se.pages.RemovePage("schema-editor")
		se.showSchemaEditor(schemaIndex, schema)
	})

	form.AddButton("Back", func() {
		// Discard changes - don't apply them
		schema := se.db.Schemas[schemaIndex]
		se.pages.RemovePage("edit-schema")
		se.pages.RemovePage("schema-editor")
		se.showSchemaEditor(schemaIndex, schema)
	})

	form.SetBorder(true).SetTitle(" Edit Schema ").SetTitleAlign(tview.AlignLeft)
	form.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.showExitConfirmation("edit-schema", "schema-editor")
			return nil
		}
		return event
	})

	se.pages.AddPage("edit-schema", form, true, true)
}
88	pkg/ui/table_dataops.go	Normal file
@@ -0,0 +1,88 @@
package ui

import "git.warky.dev/wdevs/relspecgo/pkg/models"

// Table data operations - business logic for table management

// CreateTable creates a new table and adds it to a schema
func (se *SchemaEditor) CreateTable(schemaIndex int, name, description string) *models.Table {
	if schemaIndex < 0 || schemaIndex >= len(se.db.Schemas) {
		return nil
	}

	schema := se.db.Schemas[schemaIndex]
	newTable := &models.Table{
		Name:        name,
		Schema:      schema.Name,
		Description: description,
		Columns:     make(map[string]*models.Column),
		Constraints: make(map[string]*models.Constraint),
		Indexes:     make(map[string]*models.Index),
	}
	schema.UpdateDate()
	schema.Tables = append(schema.Tables, newTable)
	return newTable
}

// UpdateTable updates an existing table's properties
func (se *SchemaEditor) UpdateTable(schemaIndex, tableIndex int, name, description string) {
	if schemaIndex < 0 || schemaIndex >= len(se.db.Schemas) {
		return
	}

	schema := se.db.Schemas[schemaIndex]
	if tableIndex < 0 || tableIndex >= len(schema.Tables) {
		return
	}
	schema.UpdateDate()
	table := schema.Tables[tableIndex]
	table.Name = name
	table.Description = description
	table.UpdateDate()
}

// DeleteTable removes a table from a schema
func (se *SchemaEditor) DeleteTable(schemaIndex, tableIndex int) bool {
	if schemaIndex < 0 || schemaIndex >= len(se.db.Schemas) {
		return false
	}

	schema := se.db.Schemas[schemaIndex]
	if tableIndex < 0 || tableIndex >= len(schema.Tables) {
		return false
	}
	schema.UpdateDate()
	schema.Tables = append(schema.Tables[:tableIndex], schema.Tables[tableIndex+1:]...)
	return true
}

// GetTable returns a table by schema and table index
func (se *SchemaEditor) GetTable(schemaIndex, tableIndex int) *models.Table {
	if schemaIndex < 0 || schemaIndex >= len(se.db.Schemas) {
		return nil
	}

	schema := se.db.Schemas[schemaIndex]
	if tableIndex < 0 || tableIndex >= len(schema.Tables) {
		return nil
	}

	return schema.Tables[tableIndex]
}

// GetAllTables returns all tables across all schemas
func (se *SchemaEditor) GetAllTables() []*models.Table {
	var tables []*models.Table
	for _, schema := range se.db.Schemas {
		tables = append(tables, schema.Tables...)
	}
	return tables
}

// GetTablesInSchema returns all tables in a specific schema
func (se *SchemaEditor) GetTablesInSchema(schemaIndex int) []*models.Table {
	if schemaIndex < 0 || schemaIndex >= len(se.db.Schemas) {
		return nil
	}
	return se.db.Schemas[schemaIndex].Tables
}
546	pkg/ui/table_screens.go	Normal file
@@ -0,0 +1,546 @@
package ui

import (
	"fmt"
	"strings"

	"github.com/gdamore/tcell/v2"
	"github.com/rivo/tview"

	"git.warky.dev/wdevs/relspecgo/pkg/models"
)

// showTableList displays all tables across all schemas
func (se *SchemaEditor) showTableList() {
	flex := tview.NewFlex().SetDirection(tview.FlexRow)

	// Title
	title := tview.NewTextView().
		SetText("[::b]All Tables").
		SetDynamicColors(true).
		SetTextAlign(tview.AlignCenter)

	// Create tables table
	tableTable := tview.NewTable().SetBorders(true).SetSelectable(true, false).SetFixed(1, 0)

	// Add header row with padding for full width
	headers := []string{"Name", "Schema", "Sequence", "Total Columns", "Total Relations", "Total Indexes", "GUID", "Description", "Comment"}
	headerWidths := []int{18, 15, 12, 14, 15, 14, 36, 0, 12} // Description gets remainder
	for i, header := range headers {
		padding := ""
		if i < len(headerWidths) && headerWidths[i] > 0 {
			padding = strings.Repeat(" ", headerWidths[i]-len(header))
		}
		cell := tview.NewTableCell(header + padding).
			SetTextColor(tcell.ColorYellow).
			SetSelectable(false).
			SetAlign(tview.AlignLeft)
		tableTable.SetCell(0, i, cell)
	}

	var tables []*models.Table
	var tableLocations []struct{ schemaIdx, tableIdx int }

	for si, schema := range se.db.Schemas {
		for ti, table := range schema.Tables {
			tables = append(tables, table)
			tableLocations = append(tableLocations, struct{ schemaIdx, tableIdx int }{si, ti})
		}
	}

	for row, table := range tables {
		tableIdx := tableLocations[row]
		schema := se.db.Schemas[tableIdx.schemaIdx]

		// Name - pad to 18 chars
		nameStr := fmt.Sprintf("%-18s", table.Name)
		nameCell := tview.NewTableCell(nameStr).SetSelectable(true)
		tableTable.SetCell(row+1, 0, nameCell)

		// Schema - pad to 15 chars
		schemaStr := fmt.Sprintf("%-15s", schema.Name)
		schemaCell := tview.NewTableCell(schemaStr).SetSelectable(true)
		tableTable.SetCell(row+1, 1, schemaCell)

		// Sequence - pad to 12 chars
		seqStr := fmt.Sprintf("%-12s", fmt.Sprintf("%d", table.Sequence))
		seqCell := tview.NewTableCell(seqStr).SetSelectable(true)
		tableTable.SetCell(row+1, 2, seqCell)

		// Total Columns - pad to 14 chars
		colsStr := fmt.Sprintf("%-14s", fmt.Sprintf("%d", len(table.Columns)))
		colsCell := tview.NewTableCell(colsStr).SetSelectable(true)
		tableTable.SetCell(row+1, 3, colsCell)

		// Total Relations - pad to 15 chars
		relsStr := fmt.Sprintf("%-15s", fmt.Sprintf("%d", len(table.Relationships)))
		relsCell := tview.NewTableCell(relsStr).SetSelectable(true)
		tableTable.SetCell(row+1, 4, relsCell)

		// Total Indexes - pad to 14 chars
		idxStr := fmt.Sprintf("%-14s", fmt.Sprintf("%d", len(table.Indexes)))
		idxCell := tview.NewTableCell(idxStr).SetSelectable(true)
		tableTable.SetCell(row+1, 5, idxCell)

		// GUID - pad to 36 chars
		guidStr := fmt.Sprintf("%-36s", table.GUID)
		guidCell := tview.NewTableCell(guidStr).SetSelectable(true)
		tableTable.SetCell(row+1, 6, guidCell)

		// Description - no padding, takes remaining space
		descCell := tview.NewTableCell(table.Description).SetSelectable(true)
		tableTable.SetCell(row+1, 7, descCell)

		// Comment - pad to 12 chars
		commentStr := fmt.Sprintf("%-12s", table.Comment)
		commentCell := tview.NewTableCell(commentStr).SetSelectable(true)
		tableTable.SetCell(row+1, 8, commentCell)
	}

	tableTable.SetTitle(" All Tables ").SetBorder(true).SetTitleAlign(tview.AlignLeft)

	// Action buttons (define before input capture)
	btnFlex := tview.NewFlex()
	btnNewTable := tview.NewButton("New Table [n]").SetSelectedFunc(func() {
		se.showNewTableDialogFromList()
	})
	btnBack := tview.NewButton("Back [b]").SetSelectedFunc(func() {
		se.pages.SwitchToPage("main")
		se.pages.RemovePage("tables")
	})

	// Set up button input captures for Tab/Shift+Tab navigation
	btnNewTable.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyBacktab {
			se.app.SetFocus(tableTable)
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(btnBack)
			return nil
		}
		return event
	})

	btnBack.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyBacktab {
			se.app.SetFocus(btnNewTable)
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(tableTable)
			return nil
		}
		return event
	})

	btnFlex.AddItem(btnNewTable, 0, 1, true).
		AddItem(btnBack, 0, 1, false)

	tableTable.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.pages.SwitchToPage("main")
			se.pages.RemovePage("tables")
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(btnNewTable)
			return nil
		}
		if event.Key() == tcell.KeyEnter {
			row, _ := tableTable.GetSelection()
			if row > 0 && row <= len(tables) { // Skip header row
				tableIdx := tableLocations[row-1]
				se.showTableEditor(tableIdx.schemaIdx, tableIdx.tableIdx, tables[row-1])
				return nil
			}
		}
		if event.Rune() == 'n' {
			se.showNewTableDialogFromList()
			return nil
		}
		if event.Rune() == 'b' {
			se.pages.SwitchToPage("main")
			se.pages.RemovePage("tables")
			return nil
		}
		return event
	})

	flex.AddItem(title, 1, 0, false).
		AddItem(tableTable, 0, 1, true).
		AddItem(btnFlex, 1, 0, false)

	se.pages.AddPage("tables", flex, true, true)
}

// showTableEditor shows editor for a specific table
func (se *SchemaEditor) showTableEditor(schemaIndex, tableIndex int, table *models.Table) {
	flex := tview.NewFlex().SetDirection(tview.FlexRow)

	// Title
	title := tview.NewTextView().
		SetText(fmt.Sprintf("[::b]Table: %s", table.Name)).
		SetDynamicColors(true).
		SetTextAlign(tview.AlignCenter)

	// Table info
	info := tview.NewTextView().SetDynamicColors(true)
	info.SetText(fmt.Sprintf("Schema: %s | Columns: %d | Description: %s",
		table.Schema, len(table.Columns), table.Description))

	// Create columns table
	colTable := tview.NewTable().SetBorders(true).SetSelectable(true, false).SetFixed(1, 0)

	// Add header row with padding for full width
	headers := []string{"Name", "Type", "Default", "KeyType", "GUID", "Description"}
	headerWidths := []int{20, 18, 15, 15, 36} // Last column takes remaining space
	for i, header := range headers {
		padding := ""
		if i < len(headerWidths) {
			padding = strings.Repeat(" ", headerWidths[i]-len(header))
		}
		cell := tview.NewTableCell(header + padding).
			SetTextColor(tcell.ColorYellow).
			SetSelectable(false).
			SetAlign(tview.AlignLeft)
		colTable.SetCell(0, i, cell)
	}

	// Get sorted column names
	columnNames := getColumnNames(table)
	for row, colName := range columnNames {
		column := table.Columns[colName]

		// Name - pad to 20 chars
		nameStr := fmt.Sprintf("%-20s", colName)
		nameCell := tview.NewTableCell(nameStr).SetSelectable(true)
		colTable.SetCell(row+1, 0, nameCell)

		// Type - pad to 18 chars
		typeStr := fmt.Sprintf("%-18s", column.Type)
		typeCell := tview.NewTableCell(typeStr).SetSelectable(true)
		colTable.SetCell(row+1, 1, typeCell)

		// Default - pad to 15 chars
		defaultStr := ""
		if column.Default != nil {
			defaultStr = fmt.Sprintf("%v", column.Default)
		}
		defaultStr = fmt.Sprintf("%-15s", defaultStr)
		defaultCell := tview.NewTableCell(defaultStr).SetSelectable(true)
		colTable.SetCell(row+1, 2, defaultCell)

		// KeyType - pad to 15 chars
		keyTypeStr := ""
		if column.IsPrimaryKey {
			keyTypeStr = "PRIMARY"
		} else if column.NotNull {
			keyTypeStr = "NOT NULL"
		}
		keyTypeStr = fmt.Sprintf("%-15s", keyTypeStr)
		keyTypeCell := tview.NewTableCell(keyTypeStr).SetSelectable(true)
		colTable.SetCell(row+1, 3, keyTypeCell)

		// GUID - pad to 36 chars
		guidStr := fmt.Sprintf("%-36s", column.GUID)
		guidCell := tview.NewTableCell(guidStr).SetSelectable(true)
		colTable.SetCell(row+1, 4, guidCell)

		// Description
		descCell := tview.NewTableCell(column.Description).SetSelectable(true)
		colTable.SetCell(row+1, 5, descCell)
	}

	colTable.SetTitle(" Columns ").SetBorder(true).SetTitleAlign(tview.AlignLeft)

	// Action buttons flex (define before input capture)
	btnFlex := tview.NewFlex()
	btnNewCol := tview.NewButton("Add Column [n]").SetSelectedFunc(func() {
		se.showNewColumnDialog(schemaIndex, tableIndex)
	})
	btnEditTable := tview.NewButton("Edit Table [e]").SetSelectedFunc(func() {
		se.showEditTableDialog(schemaIndex, tableIndex)
	})
	btnEditColumn := tview.NewButton("Edit Column [c]").SetSelectedFunc(func() {
		row, _ := colTable.GetSelection()
		if row > 0 && row <= len(columnNames) { // Skip header row
			colName := columnNames[row-1]
			column := table.Columns[colName]
			se.showColumnEditor(schemaIndex, tableIndex, row-1, column)
		}
	})
	btnDelTable := tview.NewButton("Delete Table [d]").SetSelectedFunc(func() {
		se.showDeleteTableConfirm(schemaIndex, tableIndex)
	})
	btnBack := tview.NewButton("Back to Schema [b]").SetSelectedFunc(func() {
		se.pages.RemovePage("table-editor")
		se.pages.SwitchToPage("schema-editor")
	})

	// Set up button input captures for Tab/Shift+Tab navigation
	btnNewCol.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyBacktab {
			se.app.SetFocus(colTable)
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(btnEditColumn)
			return nil
		}
		return event
	})

	btnEditColumn.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyBacktab {
			se.app.SetFocus(btnNewCol)
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(btnEditTable)
			return nil
		}
		return event
	})

	btnEditTable.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyBacktab {
			se.app.SetFocus(btnEditColumn)
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(btnDelTable)
			return nil
		}
		return event
	})

	btnDelTable.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyBacktab {
			se.app.SetFocus(btnEditTable)
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(btnBack)
			return nil
		}
		return event
	})

	btnBack.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyBacktab {
			se.app.SetFocus(btnDelTable)
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(colTable)
			return nil
		}
		return event
	})

	btnFlex.AddItem(btnNewCol, 0, 1, true).
		AddItem(btnEditColumn, 0, 1, false).
		AddItem(btnEditTable, 0, 1, false).
		AddItem(btnDelTable, 0, 1, false).
		AddItem(btnBack, 0, 1, false)

	colTable.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.pages.SwitchToPage("schema-editor")
			se.pages.RemovePage("table-editor")
			return nil
		}
		if event.Key() == tcell.KeyTab {
			se.app.SetFocus(btnNewCol)
			return nil
		}
		if event.Key() == tcell.KeyEnter {
			row, _ := colTable.GetSelection()
			if row > 0 && row <= len(columnNames) { // Skip header row; bound check avoids out-of-range
				colName := columnNames[row-1]
				column := table.Columns[colName]
				se.showColumnEditor(schemaIndex, tableIndex, row-1, column)
				return nil
			}
		}
		if event.Rune() == 'c' {
			row, _ := colTable.GetSelection()
			if row > 0 && row <= len(columnNames) { // Skip header row
				colName := columnNames[row-1]
				column := table.Columns[colName]
				se.showColumnEditor(schemaIndex, tableIndex, row-1, column)
			}
			return nil
		}
		if event.Rune() == 'b' {
			se.pages.RemovePage("table-editor")
			se.pages.SwitchToPage("schema-editor")
			return nil
		}
		return event
	})

	flex.AddItem(title, 1, 0, false).
		AddItem(info, 2, 0, false).
		AddItem(colTable, 0, 1, true).
||||
AddItem(btnFlex, 1, 0, false)
|
||||
|
||||
se.pages.AddPage("table-editor", flex, true, true)
|
||||
}
|
||||
|
||||
// showNewTableDialog shows dialog to create a new table
func (se *SchemaEditor) showNewTableDialog(schemaIndex int) {
	form := tview.NewForm()

	tableName := ""
	description := ""

	form.AddInputField("Table Name", "", 40, nil, func(value string) {
		tableName = value
	})

	form.AddInputField("Description", "", 40, nil, func(value string) {
		description = value
	})

	form.AddButton("Save", func() {
		if tableName == "" {
			return
		}

		se.CreateTable(schemaIndex, tableName, description)

		schema := se.db.Schemas[schemaIndex]
		se.pages.RemovePage("new-table")
		se.pages.RemovePage("schema-editor")
		se.showSchemaEditor(schemaIndex, schema)
	})

	form.AddButton("Back", func() {
		schema := se.db.Schemas[schemaIndex]
		se.pages.RemovePage("new-table")
		se.pages.RemovePage("schema-editor")
		se.showSchemaEditor(schemaIndex, schema)
	})

	form.SetBorder(true).SetTitle(" New Table ").SetTitleAlign(tview.AlignLeft)
	form.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.showExitConfirmation("new-table", "schema-editor")
			return nil
		}
		return event
	})

	se.pages.AddPage("new-table", form, true, true)
}

// showNewTableDialogFromList shows dialog to create a new table with schema selection
func (se *SchemaEditor) showNewTableDialogFromList() {
	form := tview.NewForm()

	tableName := ""
	description := ""
	selectedSchemaIdx := 0

	// Create schema dropdown options
	schemaOptions := make([]string, len(se.db.Schemas))
	for i, schema := range se.db.Schemas {
		schemaOptions[i] = schema.Name
	}

	form.AddInputField("Table Name", "", 40, nil, func(value string) {
		tableName = value
	})

	form.AddDropDown("Schema", schemaOptions, 0, func(option string, optionIndex int) {
		selectedSchemaIdx = optionIndex
	})

	form.AddInputField("Description", "", 40, nil, func(value string) {
		description = value
	})

	form.AddButton("Save", func() {
		if tableName == "" {
			return
		}

		se.CreateTable(selectedSchemaIdx, tableName, description)

		se.pages.RemovePage("new-table-from-list")
		se.pages.RemovePage("tables")
		se.showTableList()
	})

	form.AddButton("Back", func() {
		se.pages.RemovePage("new-table-from-list")
		se.pages.RemovePage("tables")
		se.showTableList()
	})

	form.SetBorder(true).SetTitle(" New Table ").SetTitleAlign(tview.AlignLeft)
	form.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.showExitConfirmation("new-table-from-list", "tables")
			return nil
		}
		return event
	})

	se.pages.AddPage("new-table-from-list", form, true, true)
}

// showEditTableDialog shows dialog to edit table properties
func (se *SchemaEditor) showEditTableDialog(schemaIndex, tableIndex int) {
	table := se.db.Schemas[schemaIndex].Tables[tableIndex]
	form := tview.NewForm()

	// Local variables to collect changes
	newName := table.Name
	newDescription := table.Description
	newGUID := table.GUID

	form.AddInputField("Table Name", table.Name, 40, nil, func(value string) {
		newName = value
	})

	form.AddTextArea("Description", table.Description, 40, 5, 0, func(value string) {
		newDescription = value
	})

	form.AddInputField("GUID", table.GUID, 40, nil, func(value string) {
		newGUID = value
	})

	form.AddButton("Save", func() {
		// Apply changes using dataops
		se.UpdateTable(schemaIndex, tableIndex, newName, newDescription)
		se.db.Schemas[schemaIndex].Tables[tableIndex].GUID = newGUID

		table := se.db.Schemas[schemaIndex].Tables[tableIndex]
		se.pages.RemovePage("edit-table")
		se.pages.RemovePage("table-editor")
		se.showTableEditor(schemaIndex, tableIndex, table)
	})

	form.AddButton("Back", func() {
		// Discard changes - don't apply them
		table := se.db.Schemas[schemaIndex].Tables[tableIndex]
		se.pages.RemovePage("edit-table")
		se.pages.RemovePage("table-editor")
		se.showTableEditor(schemaIndex, tableIndex, table)
	})

	form.SetBorder(true).SetTitle(" Edit Table ").SetTitleAlign(tview.AlignLeft)
	form.SetInputCapture(func(event *tcell.EventKey) *tcell.EventKey {
		if event.Key() == tcell.KeyEscape {
			se.showExitConfirmation("edit-table", "table-editor")
			return nil
		}
		return event
	})

	se.pages.AddPage("edit-table", form, true, true)
}
299
pkg/ui/ui_rules.md
Normal file
@@ -0,0 +1,299 @@
# UI Rules and Guidelines

## Layout Requirements

All layouts and forms must live in separate files according to their domain or entity.

### Screen Layout Structure

All screens must follow this consistent layout:

1. **Title at the Top** (1 row, fixed height)
   - Centered bold text: `[::b]Title Text`
   - Use `tview.NewTextView()` with `SetTextAlign(tview.AlignCenter)`
   - Enable dynamic colors: `SetDynamicColors(true)`

2. **Content in the Middle** (flexible height)
   - Tables, lists, forms, or info displays
   - Uses flex weight of 1 for dynamic sizing

3. **Action Buttons at the Bottom** (1 row, fixed height)
   - Must be in a horizontal flex container
   - Action buttons before Back button
   - Back button is always last

### Form Layout Structure

All forms must follow this button order:

1. **Save Button** (always first)
   - Label: "Save"
   - Primary action that commits changes

2. **Delete Button** (optional, only for edit forms)
   - Label: "Delete"
   - Only shown when editing existing items (not for new items)
   - Must show confirmation dialog before deletion

3. **Back Button** (always last)
   - Label: "Back"
   - Returns to previous screen without saving

**Button Order Examples:**
- **New Item Forms:** Save, Back
- **Edit Item Forms:** Save, Delete, Back

## Tab Navigation

All screens must implement circular tab navigation:

1. **Tab Key** - Moves focus to the next focusable element
2. **Shift+Tab (BackTab)** - Moves focus to the previous focusable element
3. **At the End** - Tab cycles back to the first element
4. **At the Start** - Shift+Tab cycles back to the last element

**Navigation Flow Pattern:**
- Each widget must handle both Tab and BackTab
- First widget: BackTab → Last widget, Tab → Second widget
- Middle widgets: BackTab → Previous widget, Tab → Next widget
- Last widget: BackTab → Previous widget, Tab → First widget
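
The circular order above reduces to modular index arithmetic over the focus ring. A minimal, tview-free sketch (the `nextFocus`/`prevFocus` helpers and the widget names are illustrative, not part of the RelSpec codebase):

```go
package main

import "fmt"

// nextFocus and prevFocus compute circular Tab / Shift+Tab targets
// for a focus ring of n widgets, indexed 0..n-1.
func nextFocus(i, n int) int { return (i + 1) % n }
func prevFocus(i, n int) int { return (i - 1 + n) % n }

func main() {
	widgets := []string{"colTable", "btnNewCol", "btnEditColumn", "btnEditTable", "btnDelTable", "btnBack"}
	n := len(widgets)
	fmt.Println(widgets[nextFocus(n-1, n)]) // Tab on the last widget wraps to "colTable"
	fmt.Println(widgets[prevFocus(0, n)])   // Shift+Tab on the first widget wraps to "btnBack"
}
```

Each widget's `SetInputCapture` then only needs to call `se.app.SetFocus` on the neighbor these functions select.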

## Keyboard Shortcuts

### Standard Keys

- **ESC** - Cancel current operation or go back to previous screen
- **Tab** - Move focus forward (circular)
- **Shift+Tab** - Move focus backward (circular)
- **Enter** - Activate/select current item in tables and lists

### Letter Key Shortcuts

- **'n'** - New (create new item)
- **'b'** - Back (return to previous screen)
- **'e'** - Edit (edit current item)
- **'d'** - Delete (delete current item)
- **'c'** - Edit Column (in table editor)
- **'s'** - Manage Schemas (in main menu)
- **'t'** - Manage Tables (in main menu)
- **'q'** - Quit/Exit (in main menu)
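
A screen's input-capture handler can route these letter shortcuts through a single lookup table instead of a chain of `if event.Rune()` checks. A tview-free sketch (the `shortcutAction` map and action names are illustrative):

```go
package main

import "fmt"

// shortcutAction maps letter shortcuts to the screen action they trigger.
var shortcutAction = map[rune]string{
	'n': "new", 'b': "back", 'e': "edit", 'd': "delete",
	'c': "edit-column", 's': "schemas", 't': "tables", 'q': "quit",
}

// dispatch returns the action for a key, or "" when the key should
// fall through to the widget's default handling.
func dispatch(r rune) string { return shortcutAction[r] }

func main() {
	fmt.Println(dispatch('e')) // edit
	fmt.Println(dispatch('x')) // unmapped: empty string, event passes through
}
```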

## Consistency Requirements

1. **Layout Structure** - All screens: Title (top) → Content (middle) → Buttons (bottom)
2. **Title Format** - Bold (`[::b]`), centered, dynamic colors enabled
3. **Tables** - Fixed headers (row 0), borders enabled, selectable rows
4. **Buttons** - Include keyboard shortcuts in labels (e.g., "Back [b]")
5. **Forms** - Button order: Save, Delete (if edit), Back
6. **Destructive Actions** - Always show confirmation dialogs
7. **ESC Key** - All screens support ESC to go back
8. **Action Buttons** - Positioned before Back button, in logical order
9. **Data Refresh** - Always refresh the previous screen when returning from a form or dialog

## Widget Naming Conventions

- **Tables:** `schemaTable`, `tableTable`, `colTable`
- **Buttons:** Prefix with `btn` (e.g., `btnBack`, `btnDelete`, `btnNewSchema`)
- **Flex containers:** `btnFlex` for button containers, `flex` for main layout
- **Forms:** `form`
- **Lists:** `list`, `tableList`
- **Text views:** `title`, `info`
- Use camelCase for all variable names

## Page Naming Conventions

Use descriptive kebab-case names:

- **Main screens:** `main`, `schemas`, `tables`, `schema-editor`, `table-editor`, `column-editor`
- **Load/Save screens:** `load-database`, `save-database`
- **Creation dialogs:** `new-schema`, `new-table`, `new-column`, `new-table-from-list`
- **Edit dialogs:** `edit-schema`, `edit-table`
- **Confirmations:** `confirm-delete-schema`, `confirm-delete-table`, `confirm-delete-column`
- **Exit confirmations:** `exit-confirm`, `exit-editor-confirm`
- **Status dialogs:** `error-dialog`, `success-dialog`

## Dialog and Confirmation Rules

### Confirmation Dialogs

1. **Delete Confirmations** - Required for all destructive actions
   - Show item name in confirmation text
   - Buttons: "Cancel", "Delete"
   - ESC key dismisses dialog

2. **Exit Confirmations** - Required when exiting forms with potential unsaved changes
   - Text: "Exit without saving changes?"
   - Buttons: "Cancel", "No, exit without saving"
   - ESC key confirms exit

3. **Save Confirmations** - Optional, based on context
   - Use for critical data changes
   - Clear description of what will be saved

### Dialog Behavior

- All dialogs must capture ESC key for dismissal
- Modal dialogs overlay current screen
- Confirmation dialogs use `tview.NewModal()`
- Remove dialog page after action completes

## Data Refresh Rules

When returning from any form or dialog, the previous screen must be refreshed to show updated data. If the screen contains tables, their data must be updated:

1. **After Save** - Remove and recreate the previous screen to display updated data
2. **After Delete** - Remove and recreate the previous screen to display remaining data
3. **After Cancel/Back** - Remove and recreate the previous screen (data may have changed)
4. **Implementation Pattern** - Remove the current page, remove the previous page, then recreate the previous page with fresh data
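
The remove-and-recreate pattern can be modeled without tview: treat pages as a name → rendered-content registry, and rebuild the previous page from the data source rather than reusing its cached render. The `pageStack` type and helper names below are illustrative, not RelSpec APIs:

```go
package main

import "fmt"

// pageStack is a toy stand-in for tview.Pages: page name -> rendered content.
type pageStack map[string]string

func renderSchemaList(schemas []string) string {
	return fmt.Sprintf("schemas: %v", schemas)
}

// returnFromDialog implements the rule: remove the dialog, remove the stale
// previous page, then recreate the previous page from fresh data.
func returnFromDialog(p pageStack, dialog string, schemas []string) {
	delete(p, dialog)
	delete(p, "schemas")
	p["schemas"] = renderSchemaList(schemas) // rebuilt, never reused
}

func main() {
	p := pageStack{"schemas": renderSchemaList([]string{"public"})}
	p["edit-schema"] = "editing public"

	// User renames the schema and saves, then the dialog closes.
	returnFromDialog(p, "edit-schema", []string{"core"})
	fmt.Println(p["schemas"]) // schemas: [core]
}
```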

**Why This Matters:**
- Ensures users see their changes immediately
- Prevents stale data from being displayed
- Maintains data consistency across the UI
- Avoids confusion from seeing outdated information

**Example Flow:**
```
User on Schema List → Opens Edit Schema Form → Saves Changes →
Returns to Schema List (refreshed with updated schema data)
```

## Big Loading/Saving Operations

When loading large changes, files, or data, always show a "load completed" or "load error" dialog, and do the same when saving. This keeps the user informed about what happened. When data is dirty, always ask the user to save before exiting.

### Load/Save Screens

- **Load Screen** (`load-database`) - Shown when no source is specified via command line
  - Format dropdown (dbml, dctx, drawdb, graphql, json, yaml, gorm, bun, drizzle, prisma, typeorm, pgsql)
  - File path input (for file-based formats)
  - Connection string input (for database formats like pgsql)
  - Load button [l] - Loads the database
  - Create New button [n] - Creates a new empty database
  - Exit button [q] - Exits the application
  - ESC key exits the application

- **Save Screen** (`save-database`) - Accessible from main menu with 'w' key
  - Format dropdown (same as load screen)
  - File path input
  - Help text explaining format requirements
  - Save button [s] - Saves the database
  - Back button [b] - Returns to main menu
  - ESC key returns to main menu
  - Pre-populated with existing save configuration if available

### Status Dialogs

- **Error Dialog** (`error-dialog`) - Shows error messages with OK button and ESC to dismiss
- **Success Dialog** (`success-dialog`) - Shows success messages with OK button, ESC to dismiss, and optional callback on close

## Screen Organization

Organize UI code into these files:

### UI Files (Screens and Dialogs)

- **editor.go** - Core `SchemaEditor` struct, constructor, `Run()` method, helper functions
- **main_menu.go** - Main menu screen
- **load_save_screens.go** - Database load and save screens
- **database_screens.go** - Database edit form
- **schema_screens.go** - Schema list, schema editor, new/edit schema dialogs
- **table_screens.go** - Tables list, table editor, new/edit table dialogs
- **column_screens.go** - Column editor, new column dialog
- **domain_screens.go** - Domain list, domain editor, new/edit domain dialogs
- **dialogs.go** - Confirmation dialogs (exit, delete)

### Data Operations Files (Business Logic)

- **schema_dataops.go** - Schema CRUD operations (Create, Read, Update, Delete)
- **table_dataops.go** - Table CRUD operations
- **column_dataops.go** - Column CRUD operations

## Code Separation Rules

### UI vs Business Logic

1. **UI Files** - Handle only presentation and user interaction
   - Display data in tables, lists, and forms
   - Capture user input
   - Navigate between screens
   - Show/hide dialogs
   - Call dataops methods for actual data changes

2. **Dataops Files** - Handle only business logic and data manipulation
   - Create, read, update, delete operations
   - Data validation
   - Data structure manipulation
   - Return created/updated objects or success/failure status
   - No UI code or tview references

### Implementation Pattern

#### Creating New Items

**Bad (Direct Data Manipulation in UI):**
```go
form.AddButton("Save", func() {
    schema := &models.Schema{Name: name, Description: desc, ...}
    se.db.Schemas = append(se.db.Schemas, schema)
})
```

**Good (Using Dataops Methods):**
```go
form.AddButton("Save", func() {
    se.CreateSchema(name, description)
})
```

#### Editing Existing Items

**Bad (Modifying Data in onChange Callbacks):**
```go
form.AddInputField("Name", column.Name, 40, nil, func(value string) {
    column.Name = value // Changes immediately as user types!
})
form.AddButton("Save", func() {
    // Data already changed, just refresh screen
})
```

**Good (Local Variables + Dataops on Save):**
```go
// Store original values
originalName := column.Name
newName := column.Name

form.AddInputField("Name", column.Name, 40, nil, func(value string) {
    newName = value // Store in local variable
})

form.AddButton("Save", func() {
    // Apply changes only when Save is clicked
    se.UpdateColumn(schemaIndex, tableIndex, originalName, newName, ...)
    // Then refresh screen
})

form.AddButton("Back", func() {
    // Discard changes - don't apply local variables
    // Just refresh screen
})
```

### Why This Matters

**Edit Forms Must Use Local Variables:**
1. **Deferred Changes** - Changes only apply when Save is clicked
2. **Cancellable** - Back button discards changes without saving
3. **Handles Renames** - Original name preserved to update map keys correctly
4. **User Expectations** - Save means "commit changes", Back means "cancel"
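
Stripped of tview, the deferred-change pattern is just: buffer edits in locals, commit on Save, drop on Back. A self-contained sketch of why the original name must survive a rename (the `updateColumnName` helper is illustrative; RelSpec's real dataops methods take more parameters):

```go
package main

import "fmt"

// updateColumnName commits a rename, re-keying the column map so lookups
// by the new name work - this is why the original name must be preserved.
func updateColumnName(columns map[string]string, originalName, newName string) {
	colType := columns[originalName]
	delete(columns, originalName)
	columns[newName] = colType
}

func main() {
	columns := map[string]string{"user_id": "bigint"}

	// Form opens: buffer the edit in locals instead of mutating directly.
	originalName := "user_id"
	newName := originalName
	newName = "account_id" // user types in the input field

	// Back would simply discard originalName/newName; Save commits:
	updateColumnName(columns, originalName, newName)
	fmt.Println(columns["account_id"]) // bigint
}
```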

This separation ensures:
- Cleaner, more maintainable code
- Reusable business logic
- Easier testing
- Clear separation of concerns
- Proper change management (save vs cancel)
@@ -5,6 +5,7 @@ import (
 	"strings"

 	"git.warky.dev/wdevs/relspecgo/pkg/models"
+	"git.warky.dev/wdevs/relspecgo/pkg/writers"
 )

 // TemplateData represents the data passed to the template for code generation
@@ -133,8 +134,10 @@ func NewModelData(table *models.Table, schema string, typeMapper *TypeMapper) *M
 	// Find primary key
 	for _, col := range table.Columns {
 		if col.IsPrimaryKey {
-			model.PrimaryKeyField = SnakeCaseToPascalCase(col.Name)
-			model.IDColumnName = col.Name
+			// Sanitize column name to remove backticks
+			safeName := writers.SanitizeStructTagValue(col.Name)
+			model.PrimaryKeyField = SnakeCaseToPascalCase(safeName)
+			model.IDColumnName = safeName
 			// Check if PK type is a SQL type (contains resolvespec_common or sql_types)
 			goType := typeMapper.SQLTypeToGoType(col.Type, col.NotNull)
 			model.PrimaryKeyIsSQL = strings.Contains(goType, "resolvespec_common") || strings.Contains(goType, "sql_types")
@@ -146,6 +149,8 @@ func NewModelData(table *models.Table, schema string, typeMapper *TypeMapper) *M
 	columns := sortColumns(table.Columns)
 	for _, col := range columns {
 		field := columnToField(col, table, typeMapper)
+		// Check for name collision with generated methods and rename if needed
+		field.Name = resolveFieldNameCollision(field.Name)
 		model.Fields = append(model.Fields, field)
 	}

@@ -154,10 +159,13 @@ func NewModelData(table *models.Table, schema string, typeMapper *TypeMapper) *M

 // columnToField converts a models.Column to FieldData
 func columnToField(col *models.Column, table *models.Table, typeMapper *TypeMapper) *FieldData {
-	fieldName := SnakeCaseToPascalCase(col.Name)
+	// Sanitize column name first to remove backticks before generating field name
+	safeName := writers.SanitizeStructTagValue(col.Name)
+	fieldName := SnakeCaseToPascalCase(safeName)
 	goType := typeMapper.SQLTypeToGoType(col.Type, col.NotNull)
 	bunTag := typeMapper.BuildBunTag(col, table)
-	jsonTag := col.Name // Use column name for JSON tag
+	// Use same sanitized name for JSON tag
+	jsonTag := safeName

 	return &FieldData{
 		Name: fieldName,
@@ -189,6 +197,30 @@ func hasModelPrefix(name string) bool {
 	return len(name) >= 5 && name[:5] == "Model"
 }

+// resolveFieldNameCollision checks if a field name conflicts with generated method names
+// and adds an underscore suffix if there's a collision
+func resolveFieldNameCollision(fieldName string) string {
+	// List of method names that are generated by the template
+	reservedNames := map[string]bool{
+		"TableName":     true,
+		"TableNameOnly": true,
+		"SchemaName":    true,
+		"GetID":         true,
+		"GetIDStr":      true,
+		"SetID":         true,
+		"UpdateID":      true,
+		"GetIDName":     true,
+		"GetPrefix":     true,
+	}
+
+	// Check if field name conflicts with a reserved method name
+	if reservedNames[fieldName] {
+		return fieldName + "_"
+	}
+
+	return fieldName
+}
+
 // sortColumns sorts columns by sequence, then by name
 func sortColumns(columns map[string]*models.Column) []*models.Column {
 	result := make([]*models.Column, 0, len(columns))

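The two renaming helpers in this hunk compose: `resolveFieldNameCollision` avoids clashes with generated method names, while `ensureUniqueFieldName` deduplicates repeated relationship names with numeric suffixes. A standalone re-sketch of both (logic copied from the diff, signatures simplified and the reserved-name set shortened for illustration):

```go
package main

import "fmt"

var reservedNames = map[string]bool{
	"TableName": true, "SchemaName": true, "GetID": true,
}

// resolveFieldNameCollision appends "_" when a field would shadow a generated method.
func resolveFieldNameCollision(name string) string {
	if reservedNames[name] {
		return name + "_"
	}
	return name
}

// ensureUniqueFieldName adds numeric suffixes for repeated base names.
func ensureUniqueFieldName(name string, used map[string]int) string {
	count := used[name]
	used[name]++
	if count > 0 {
		return fmt.Sprintf("%s%d", name, count+1)
	}
	return name
}

func main() {
	fmt.Println(resolveFieldNameCollision("TableName")) // TableName_
	used := map[string]int{}
	fmt.Println(ensureUniqueFieldName("RelUserID", used)) // RelUserID
	fmt.Println(ensureUniqueFieldName("RelUserID", used)) // RelUserID2
}
```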
@@ -5,6 +5,7 @@ import (
 	"strings"

 	"git.warky.dev/wdevs/relspecgo/pkg/models"
+	"git.warky.dev/wdevs/relspecgo/pkg/writers"
 )

 // TypeMapper handles type conversions between SQL and Go types for Bun
@@ -164,11 +165,14 @@ func (tm *TypeMapper) BuildBunTag(column *models.Column, table *models.Table) st
 	var parts []string

 	// Column name comes first (no prefix)
-	parts = append(parts, column.Name)
+	// Sanitize to remove backticks which would break struct tag syntax
+	safeName := writers.SanitizeStructTagValue(column.Name)
+	parts = append(parts, safeName)

 	// Add type if specified
 	if column.Type != "" {
-		typeStr := column.Type
+		// Sanitize type to remove backticks
+		typeStr := writers.SanitizeStructTagValue(column.Type)
 		if column.Length > 0 {
 			typeStr = fmt.Sprintf("%s(%d)", typeStr, column.Length)
 		} else if column.Precision > 0 {
@@ -188,12 +192,17 @@ func (tm *TypeMapper) BuildBunTag(column *models.Column, table *models.Table) st

 	// Default value
 	if column.Default != nil {
-		parts = append(parts, fmt.Sprintf("default:%v", column.Default))
+		// Sanitize default value to remove backticks
+		safeDefault := writers.SanitizeStructTagValue(fmt.Sprintf("%v", column.Default))
+		parts = append(parts, fmt.Sprintf("default:%s", safeDefault))
 	}

 	// Nullable (Bun uses nullzero for nullable fields)
+	// and notnull tag for explicitly non-nullable fields
 	if !column.NotNull && !column.IsPrimaryKey {
 		parts = append(parts, "nullzero")
+	} else if column.NotNull && !column.IsPrimaryKey {
+		parts = append(parts, "notnull")
 	}

 	// Check for indexes (unique indexes should be added to tag)
@@ -260,7 +269,7 @@ func (tm *TypeMapper) NeedsFmtImport(generateGetIDStr bool) bool {

 // GetSQLTypesImport returns the import path for sql_types (ResolveSpec common)
 func (tm *TypeMapper) GetSQLTypesImport() string {
-	return "github.com/bitechdev/ResolveSpec/pkg/common"
+	return "github.com/bitechdev/ResolveSpec/pkg/spectypes"
 }

 // GetBunImport returns the import path for Bun

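The `writers.SanitizeStructTagValue` implementation itself is not shown in this diff. A plausible minimal version, assuming (per the comments above) that its only job is to strip backticks, which would otherwise terminate the raw-string literal that Go struct tags are written in:

```go
package main

import (
	"fmt"
	"strings"
)

// sanitizeStructTagValue strips backticks from a value destined for a Go
// struct tag; a stray backtick would close the tag's raw-string literal early.
func sanitizeStructTagValue(s string) string {
	return strings.ReplaceAll(s, "`", "")
}

func main() {
	fmt.Println(sanitizeStructTagValue("`user_id`")) // user_id
}
```

The real function may also handle quotes or other invalid characters, as `writers.SanitizeFilename` does for file names in the next hunk.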
@@ -207,7 +207,10 @@ func (w *Writer) writeMultiFile(db *models.Database) error {
|
||||
}
|
||||
|
||||
// Generate filename: sql_{schema}_{table}.go
|
||||
filename := fmt.Sprintf("sql_%s_%s.go", schema.Name, table.Name)
|
||||
// Sanitize schema and table names to remove quotes, comments, and invalid characters
|
||||
safeSchemaName := writers.SanitizeFilename(schema.Name)
|
||||
safeTableName := writers.SanitizeFilename(table.Name)
|
||||
filename := fmt.Sprintf("sql_%s_%s.go", safeSchemaName, safeTableName)
|
||||
filepath := filepath.Join(w.options.OutputPath, filename)
|
||||
|
||||
// Write file
|
||||
@@ -222,6 +225,9 @@ func (w *Writer) writeMultiFile(db *models.Database) error {
|
||||
|
||||
// addRelationshipFields adds relationship fields to the model based on foreign keys
|
||||
func (w *Writer) addRelationshipFields(modelData *ModelData, table *models.Table, schema *models.Schema, db *models.Database) {
|
||||
// Track used field names to detect duplicates
|
||||
usedFieldNames := make(map[string]int)
|
||||
|
||||
// For each foreign key in this table, add a belongs-to/has-one relationship
|
||||
for _, constraint := range table.Constraints {
|
||||
if constraint.Type != models.ForeignKeyConstraint {
|
||||
@@ -236,7 +242,8 @@ func (w *Writer) addRelationshipFields(modelData *ModelData, table *models.Table
|
||||
|
||||
// Create relationship field (has-one in Bun, similar to belongs-to in GORM)
|
||||
refModelName := w.getModelName(constraint.ReferencedTable)
|
||||
fieldName := w.generateRelationshipFieldName(constraint.ReferencedTable)
|
||||
fieldName := w.generateHasOneFieldName(constraint)
|
||||
fieldName = w.ensureUniqueFieldName(fieldName, usedFieldNames)
|
||||
relationTag := w.typeMapper.BuildRelationshipTag(constraint, "has-one")
|
||||
|
||||
modelData.AddRelationshipField(&FieldData{
|
||||
@@ -264,7 +271,8 @@ func (w *Writer) addRelationshipFields(modelData *ModelData, table *models.Table
|
||||
if constraint.ReferencedTable == table.Name && constraint.ReferencedSchema == schema.Name {
|
||||
// Add has-many relationship
|
||||
otherModelName := w.getModelName(otherTable.Name)
|
||||
fieldName := w.generateRelationshipFieldName(otherTable.Name) + "s" // Pluralize
|
||||
fieldName := w.generateHasManyFieldName(constraint, otherTable.Name)
|
||||
fieldName = w.ensureUniqueFieldName(fieldName, usedFieldNames)
|
||||
relationTag := w.typeMapper.BuildRelationshipTag(constraint, "has-many")
|
||||
|
||||
modelData.AddRelationshipField(&FieldData{
|
||||
@@ -307,10 +315,60 @@ func (w *Writer) getModelName(tableName string) string {
|
||||
return modelName
|
||||
}
|
||||
|
||||
// generateRelationshipFieldName generates a field name for a relationship
|
||||
func (w *Writer) generateRelationshipFieldName(tableName string) string {
|
||||
// Use just the prefix (3 letters) for relationship fields
|
||||
return GeneratePrefix(tableName)
|
||||
// generateHasOneFieldName generates a field name for has-one relationships
|
||||
// Uses the foreign key column name for uniqueness
|
||||
func (w *Writer) generateHasOneFieldName(constraint *models.Constraint) string {
|
||||
// Use the foreign key column name to ensure uniqueness
|
||||
// If there are multiple columns, use the first one
|
||||
if len(constraint.Columns) > 0 {
|
||||
columnName := constraint.Columns[0]
|
||||
// Convert to PascalCase for proper Go field naming
|
||||
// e.g., "rid_filepointer_request" -> "RelRIDFilepointerRequest"
|
||||
return "Rel" + SnakeCaseToPascalCase(columnName)
|
||||
}
|
||||
|
||||
// Fallback to table-based prefix if no columns defined
|
||||
return "Rel" + GeneratePrefix(constraint.ReferencedTable)
|
||||
}
|
||||
|
||||
// generateHasManyFieldName generates a field name for has-many relationships
|
||||
// Uses the foreign key column name + source table name to avoid duplicates
|
||||
func (w *Writer) generateHasManyFieldName(constraint *models.Constraint, sourceTableName string) string {
|
||||
// For has-many, we need to include the source table name to avoid duplicates
|
||||
// e.g., multiple tables referencing the same column on this table
|
||||
if len(constraint.Columns) > 0 {
|
||||
columnName := constraint.Columns[0]
|
||||
// Get the model name for the source table (pluralized)
|
||||
sourceModelName := w.getModelName(sourceTableName)
|
||||
// Remove "Model" prefix if present
|
||||
sourceModelName = strings.TrimPrefix(sourceModelName, "Model")
|
||||
|
||||
// Convert column to PascalCase and combine with source table
|
||||
// e.g., "rid_api_provider" + "Login" -> "RelRIDAPIProviderLogins"
|
||||
columnPart := SnakeCaseToPascalCase(columnName)
|
||||
return "Rel" + columnPart + Pluralize(sourceModelName)
|
||||
}
|
||||
|
||||
// Fallback to table-based naming
|
||||
sourceModelName := w.getModelName(sourceTableName)
|
||||
sourceModelName = strings.TrimPrefix(sourceModelName, "Model")
|
||||
return "Rel" + Pluralize(sourceModelName)
|
||||
}
|
||||
|
||||
// ensureUniqueFieldName ensures a field name is unique by adding numeric suffixes if needed
|
||||
func (w *Writer) ensureUniqueFieldName(fieldName string, usedNames map[string]int) string {
|
||||
originalName := fieldName
|
||||
count := usedNames[originalName]
|
||||
|
||||
if count > 0 {
|
||||
// Name is already used, add numeric suffix
|
||||
fieldName = fmt.Sprintf("%s%d", originalName, count+1)
|
||||
}
|
||||
|
||||
// Increment the counter for this base name
|
||||
usedNames[originalName]++
|
||||
|
||||
return fieldName
|
||||
}
|
||||
|
||||
// getPackageName returns the package name from options or defaults to "models"
@@ -386,6 +444,7 @@ func (w *Writer) createDatabaseRef(db *models.Database) *models.Database {
		DatabaseVersion: db.DatabaseVersion,
		SourceFormat:    db.SourceFormat,
		Schemas:         nil, // Don't include schemas to avoid circular reference
		GUID:            db.GUID,
	}
}

@@ -402,5 +461,6 @@ func (w *Writer) createSchemaRef(schema *models.Schema, db *models.Database) *mo
		Sequence:    schema.Sequence,
		RefDatabase: w.createDatabaseRef(db), // Include database ref
		Tables:      nil, // Don't include tables to avoid circular reference
		GUID:        schema.GUID,
	}
}

@@ -175,12 +175,378 @@ func TestWriter_WriteDatabase_MultiFile(t *testing.T) {
	postsStr := string(postsContent)

	// Verify relationship is present with Bun format
	if !strings.Contains(postsStr, "USE") {
		t.Errorf("Missing relationship field USE")
	// Should now be RelUserID (has-one) instead of USE
	if !strings.Contains(postsStr, "RelUserID") {
		t.Errorf("Missing relationship field RelUserID (new naming convention)")
	}
	if !strings.Contains(postsStr, "rel:has-one") {
		t.Errorf("Missing Bun relationship tag: %s", postsStr)
	}

	// Check users file contains has-many relationship
	usersContent, err := os.ReadFile(filepath.Join(tmpDir, "sql_public_users.go"))
	if err != nil {
		t.Fatalf("Failed to read users file: %v", err)
	}

	usersStr := string(usersContent)

	// Should have RelUserIDPosts (has-many) field
	if !strings.Contains(usersStr, "RelUserIDPosts") {
		t.Errorf("Missing has-many relationship field RelUserIDPosts")
	}
}

func TestWriter_MultipleReferencesToSameTable(t *testing.T) {
	// Test scenario: api_event table with multiple foreign keys to filepointer table
	db := models.InitDatabase("testdb")
	schema := models.InitSchema("org")

	// Filepointer table
	filepointer := models.InitTable("filepointer", "org")
	filepointer.Columns["id_filepointer"] = &models.Column{
		Name:         "id_filepointer",
		Type:         "bigserial",
		NotNull:      true,
		IsPrimaryKey: true,
	}
	schema.Tables = append(schema.Tables, filepointer)

	// API event table with two foreign keys to filepointer
	apiEvent := models.InitTable("api_event", "org")
	apiEvent.Columns["id_api_event"] = &models.Column{
		Name:         "id_api_event",
		Type:         "bigserial",
		NotNull:      true,
		IsPrimaryKey: true,
	}
	apiEvent.Columns["rid_filepointer_request"] = &models.Column{
		Name:    "rid_filepointer_request",
		Type:    "bigint",
		NotNull: false,
	}
	apiEvent.Columns["rid_filepointer_response"] = &models.Column{
		Name:    "rid_filepointer_response",
		Type:    "bigint",
		NotNull: false,
	}

	// Add constraints
	apiEvent.Constraints["fk_request"] = &models.Constraint{
		Name:              "fk_request",
		Type:              models.ForeignKeyConstraint,
		Columns:           []string{"rid_filepointer_request"},
		ReferencedTable:   "filepointer",
		ReferencedSchema:  "org",
		ReferencedColumns: []string{"id_filepointer"},
	}
	apiEvent.Constraints["fk_response"] = &models.Constraint{
		Name:              "fk_response",
		Type:              models.ForeignKeyConstraint,
		Columns:           []string{"rid_filepointer_response"},
		ReferencedTable:   "filepointer",
		ReferencedSchema:  "org",
		ReferencedColumns: []string{"id_filepointer"},
	}

	schema.Tables = append(schema.Tables, apiEvent)
	db.Schemas = append(db.Schemas, schema)

	// Create writer
	tmpDir := t.TempDir()
	opts := &writers.WriterOptions{
		PackageName: "models",
		OutputPath:  tmpDir,
		Metadata: map[string]interface{}{
			"multi_file": true,
		},
	}

	writer := NewWriter(opts)
	err := writer.WriteDatabase(db)
	if err != nil {
		t.Fatalf("WriteDatabase failed: %v", err)
	}

	// Read the api_event file
	apiEventContent, err := os.ReadFile(filepath.Join(tmpDir, "sql_org_api_event.go"))
	if err != nil {
		t.Fatalf("Failed to read api_event file: %v", err)
	}

	contentStr := string(apiEventContent)

	// Verify both relationships have unique names based on column names
	expectations := []struct {
		fieldName string
		tag       string
	}{
		{"RelRIDFilepointerRequest", "join:rid_filepointer_request=id_filepointer"},
		{"RelRIDFilepointerResponse", "join:rid_filepointer_response=id_filepointer"},
	}

	for _, exp := range expectations {
		if !strings.Contains(contentStr, exp.fieldName) {
			t.Errorf("Missing relationship field: %s\nGenerated:\n%s", exp.fieldName, contentStr)
		}
		if !strings.Contains(contentStr, exp.tag) {
			t.Errorf("Missing relationship tag: %s\nGenerated:\n%s", exp.tag, contentStr)
		}
	}

	// Verify NO duplicate field names (old behavior would create duplicate "FIL" fields)
	if strings.Contains(contentStr, "FIL *ModelFilepointer") {
		t.Errorf("Found old prefix-based naming (FIL), should use column-based naming")
	}

	// Also verify has-many relationships on filepointer table
	filepointerContent, err := os.ReadFile(filepath.Join(tmpDir, "sql_org_filepointer.go"))
	if err != nil {
		t.Fatalf("Failed to read filepointer file: %v", err)
	}

	filepointerStr := string(filepointerContent)

	// Should have two different has-many relationships with unique names
	hasManyExpectations := []string{
		"RelRIDFilepointerRequestAPIEvents",  // Has many via rid_filepointer_request
		"RelRIDFilepointerResponseAPIEvents", // Has many via rid_filepointer_response
	}

	for _, exp := range hasManyExpectations {
		if !strings.Contains(filepointerStr, exp) {
			t.Errorf("Missing has-many relationship field: %s\nGenerated:\n%s", exp, filepointerStr)
		}
	}
}

func TestWriter_MultipleHasManyRelationships(t *testing.T) {
	// Test scenario: api_provider table referenced by multiple tables via rid_api_provider
	db := models.InitDatabase("testdb")
	schema := models.InitSchema("org")

	// Owner table
	owner := models.InitTable("owner", "org")
	owner.Columns["id_owner"] = &models.Column{
		Name:         "id_owner",
		Type:         "bigserial",
		NotNull:      true,
		IsPrimaryKey: true,
	}
	schema.Tables = append(schema.Tables, owner)

	// API Provider table
	apiProvider := models.InitTable("api_provider", "org")
	apiProvider.Columns["id_api_provider"] = &models.Column{
		Name:         "id_api_provider",
		Type:         "bigserial",
		NotNull:      true,
		IsPrimaryKey: true,
	}
	apiProvider.Columns["rid_owner"] = &models.Column{
		Name:    "rid_owner",
		Type:    "bigint",
		NotNull: true,
	}
	apiProvider.Constraints["fk_owner"] = &models.Constraint{
		Name:              "fk_owner",
		Type:              models.ForeignKeyConstraint,
		Columns:           []string{"rid_owner"},
		ReferencedTable:   "owner",
		ReferencedSchema:  "org",
		ReferencedColumns: []string{"id_owner"},
	}
	schema.Tables = append(schema.Tables, apiProvider)

	// Login table
	login := models.InitTable("login", "org")
	login.Columns["id_login"] = &models.Column{
		Name:         "id_login",
		Type:         "bigserial",
		NotNull:      true,
		IsPrimaryKey: true,
	}
	login.Columns["rid_api_provider"] = &models.Column{
		Name:    "rid_api_provider",
		Type:    "bigint",
		NotNull: true,
	}
	login.Constraints["fk_api_provider"] = &models.Constraint{
		Name:              "fk_api_provider",
		Type:              models.ForeignKeyConstraint,
		Columns:           []string{"rid_api_provider"},
		ReferencedTable:   "api_provider",
		ReferencedSchema:  "org",
		ReferencedColumns: []string{"id_api_provider"},
	}
	schema.Tables = append(schema.Tables, login)

	// Filepointer table
	filepointer := models.InitTable("filepointer", "org")
	filepointer.Columns["id_filepointer"] = &models.Column{
		Name:         "id_filepointer",
		Type:         "bigserial",
		NotNull:      true,
		IsPrimaryKey: true,
	}
	filepointer.Columns["rid_api_provider"] = &models.Column{
		Name:    "rid_api_provider",
		Type:    "bigint",
		NotNull: true,
	}
	filepointer.Constraints["fk_api_provider"] = &models.Constraint{
		Name:              "fk_api_provider",
		Type:              models.ForeignKeyConstraint,
		Columns:           []string{"rid_api_provider"},
		ReferencedTable:   "api_provider",
		ReferencedSchema:  "org",
		ReferencedColumns: []string{"id_api_provider"},
	}
	schema.Tables = append(schema.Tables, filepointer)

	// API Event table
	apiEvent := models.InitTable("api_event", "org")
	apiEvent.Columns["id_api_event"] = &models.Column{
		Name:         "id_api_event",
		Type:         "bigserial",
		NotNull:      true,
		IsPrimaryKey: true,
	}
	apiEvent.Columns["rid_api_provider"] = &models.Column{
		Name:    "rid_api_provider",
		Type:    "bigint",
		NotNull: true,
	}
	apiEvent.Constraints["fk_api_provider"] = &models.Constraint{
		Name:              "fk_api_provider",
		Type:              models.ForeignKeyConstraint,
		Columns:           []string{"rid_api_provider"},
		ReferencedTable:   "api_provider",
		ReferencedSchema:  "org",
		ReferencedColumns: []string{"id_api_provider"},
	}
	schema.Tables = append(schema.Tables, apiEvent)

	db.Schemas = append(db.Schemas, schema)

	// Create writer
	tmpDir := t.TempDir()
	opts := &writers.WriterOptions{
		PackageName: "models",
		OutputPath:  tmpDir,
		Metadata: map[string]interface{}{
			"multi_file": true,
		},
	}

	writer := NewWriter(opts)
	err := writer.WriteDatabase(db)
	if err != nil {
		t.Fatalf("WriteDatabase failed: %v", err)
	}

	// Read the api_provider file
	apiProviderContent, err := os.ReadFile(filepath.Join(tmpDir, "sql_org_api_provider.go"))
	if err != nil {
		t.Fatalf("Failed to read api_provider file: %v", err)
	}

	contentStr := string(apiProviderContent)

	// Verify all has-many relationships have unique names
	hasManyExpectations := []string{
		"RelRIDAPIProviderLogins",       // Has many via Login
		"RelRIDAPIProviderFilepointers", // Has many via Filepointer
		"RelRIDAPIProviderAPIEvents",    // Has many via APIEvent
		"RelRIDOwner",                   // Has one via rid_owner
	}

	for _, exp := range hasManyExpectations {
		if !strings.Contains(contentStr, exp) {
			t.Errorf("Missing relationship field: %s\nGenerated:\n%s", exp, contentStr)
		}
	}

	// Verify NO duplicate field names
	// Count occurrences of "RelRIDAPIProvider" fields - should have 3 unique ones
	count := strings.Count(contentStr, "RelRIDAPIProvider")
	if count != 3 {
		t.Errorf("Expected 3 RelRIDAPIProvider* fields, found %d\nGenerated:\n%s", count, contentStr)
	}

	// Verify no duplicate declarations (would cause compilation error)
	duplicatePattern := "RelRIDAPIProviders []*Model"
	if strings.Contains(contentStr, duplicatePattern) {
		t.Errorf("Found duplicate field declaration pattern, fields should be unique")
	}
}

func TestWriter_FieldNameCollision(t *testing.T) {
	// Test scenario: table with columns that would conflict with generated method names
	table := models.InitTable("audit_table", "audit")
	table.Columns["id_audit_table"] = &models.Column{
		Name:         "id_audit_table",
		Type:         "smallint",
		NotNull:      true,
		IsPrimaryKey: true,
		Sequence:     1,
	}
	table.Columns["table_name"] = &models.Column{
		Name:     "table_name",
		Type:     "varchar",
		Length:   100,
		NotNull:  true,
		Sequence: 2,
	}
	table.Columns["table_schema"] = &models.Column{
		Name:     "table_schema",
		Type:     "varchar",
		Length:   100,
		NotNull:  true,
		Sequence: 3,
	}

	// Create writer
	tmpDir := t.TempDir()
	opts := &writers.WriterOptions{
		PackageName: "models",
		OutputPath:  filepath.Join(tmpDir, "test.go"),
	}

	writer := NewWriter(opts)

	err := writer.WriteTable(table)
	if err != nil {
		t.Fatalf("WriteTable failed: %v", err)
	}

	// Read the generated file
	content, err := os.ReadFile(opts.OutputPath)
	if err != nil {
		t.Fatalf("Failed to read generated file: %v", err)
	}

	generated := string(content)

	// Verify that TableName field was renamed to TableName_ to avoid collision
	if !strings.Contains(generated, "TableName_") {
		t.Errorf("Expected field 'TableName_' (with underscore) but not found\nGenerated:\n%s", generated)
	}

	// Verify the struct tag still references the correct database column
	if !strings.Contains(generated, `bun:"table_name,`) {
		t.Errorf("Expected bun tag to reference 'table_name' column\nGenerated:\n%s", generated)
	}

	// Verify the TableName() method still exists and doesn't conflict
	if !strings.Contains(generated, "func (m ModelAuditTable) TableName() string") {
		t.Errorf("TableName() method should still be generated\nGenerated:\n%s", generated)
	}

	// Verify NO field named just "TableName" (without underscore)
	if strings.Contains(generated, "TableName resolvespec_common") || strings.Contains(generated, "TableName string") {
		t.Errorf("Field 'TableName' without underscore should not exist (would conflict with method)\nGenerated:\n%s", generated)
	}
}

func TestTypeMapper_SQLTypeToGoType_Bun(t *testing.T) {

@@ -126,7 +126,15 @@ func (w *Writer) tableToDBML(t *models.Table) string {
		attrs = append(attrs, "increment")
	}
	if column.Default != nil {
		attrs = append(attrs, fmt.Sprintf("default: `%v`", column.Default))
		// Check if default value contains backticks (DBML expressions like `now()`)
		defaultStr := fmt.Sprintf("%v", column.Default)
		if strings.HasPrefix(defaultStr, "`") && strings.HasSuffix(defaultStr, "`") {
			// Already an expression with backticks, use as-is
			attrs = append(attrs, fmt.Sprintf("default: %s", defaultStr))
		} else {
			// Regular value, wrap in single quotes
			attrs = append(attrs, fmt.Sprintf("default: '%v'", column.Default))
		}
	}
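The backtick branch in this hunk can be exercised on its own. A hedged sketch (`formatDBMLDefault` is an illustrative helper, not a function in the codebase):

```go
package main

import (
	"fmt"
	"strings"
)

// formatDBMLDefault mirrors the default-value handling above: values already
// wrapped in backticks are treated as DBML expressions and passed through,
// anything else is quoted as a plain literal.
func formatDBMLDefault(def interface{}) string {
	s := fmt.Sprintf("%v", def)
	if strings.HasPrefix(s, "`") && strings.HasSuffix(s, "`") {
		return fmt.Sprintf("default: %s", s)
	}
	return fmt.Sprintf("default: '%v'", def)
}

func main() {
	fmt.Println(formatDBMLDefault("`now()`")) // default: `now()`
	fmt.Println(formatDBMLDefault(0))         // default: '0'
}
```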

	if len(attrs) > 0 {

@@ -133,7 +133,11 @@ func (w *Writer) mapTableFields(table *models.Table) models.DCTXTable {
		prefix = table.Name[:3]
	}

	tableGuid := w.newGUID()
	// Use GUID from model if available, otherwise generate a new one
	tableGuid := table.GUID
	if tableGuid == "" {
		tableGuid = w.newGUID()
	}
	w.tableGuidMap[table.Name] = tableGuid

	dctxTable := models.DCTXTable{
@@ -171,7 +175,11 @@ func (w *Writer) mapTableKeys(table *models.Table) []models.DCTXKey {
}

func (w *Writer) mapField(column *models.Column) models.DCTXField {
	guid := w.newGUID()
	// Use GUID from model if available, otherwise generate a new one
	guid := column.GUID
	if guid == "" {
		guid = w.newGUID()
	}
	fieldKey := fmt.Sprintf("%s.%s", column.Table, column.Name)
	w.fieldGuidMap[fieldKey] = guid

@@ -209,7 +217,11 @@ func (w *Writer) mapDataType(dataType string) string {
}

func (w *Writer) mapKey(index *models.Index, table *models.Table) models.DCTXKey {
	guid := w.newGUID()
	// Use GUID from model if available, otherwise generate a new one
	guid := index.GUID
	if guid == "" {
		guid = w.newGUID()
	}
	keyKey := fmt.Sprintf("%s.%s", table.Name, index.Name)
	w.keyGuidMap[keyKey] = guid

@@ -344,7 +356,7 @@ func (w *Writer) mapRelation(rel *models.Relationship, schema *models.Schema) mo
	}

	return models.DCTXRelation{
		Guid: w.newGUID(),
		Guid: rel.GUID, // Use GUID from relationship model
		PrimaryTable: w.tableGuidMap[rel.ToTable],   // GUID of the 'to' table (e.g., users)
		ForeignTable: w.tableGuidMap[rel.FromTable], // GUID of the 'from' table (e.g., posts)
		PrimaryKey: primaryKeyGUID,

@@ -127,6 +127,51 @@ func (w *Writer) databaseToDrawDB(d *models.Database) *DrawDBSchema {
			}
		}

		// Create subject areas for domains
		for domainIdx, domainModel := range d.Domains {
			// Calculate bounds for all tables in this domain
			minX, minY := 999999, 999999
			maxX, maxY := 0, 0

			domainTableCount := 0
			for _, domainTable := range domainModel.Tables {
				// Find the table in the schema to get its position
				for _, t := range schema.Tables {
					if t.Name == domainTable.TableName {
						if t.X < minX {
							minX = t.X
						}
						if t.Y < minY {
							minY = t.Y
						}
						if t.X+colWidth > maxX {
							maxX = t.X + colWidth
						}
						if t.Y+rowHeight > maxY {
							maxY = t.Y + rowHeight
						}
						domainTableCount++
						break
					}
				}
			}

			// Only create area if domain has tables in this schema
			if domainTableCount > 0 {
				area := &DrawDBArea{
					ID:     areaID,
					Name:   domainModel.Name,
					Color:  getColorForIndex(len(d.Schemas) + domainIdx), // Use different colors than schemas
					X:      minX - 20,
					Y:      minY - 20,
					Width:  maxX - minX + 40,
					Height: maxY - minY + 40,
				}
				schema.SubjectAreas = append(schema.SubjectAreas, area)
				areaID++
			}
		}
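The sizing logic in this loop reduces to a padded bounding box over the member tables' rectangles. A standalone sketch under that assumption (the `rect` type, `boundsOf` name, and sample coordinates are illustrative, not from the repository):

```go
package main

import "fmt"

type rect struct{ x, y, w, h int }

// boundsOf mirrors the subject-area sizing above: take the min/max corners of
// all member rectangles, then pad the result by 20 on each side.
func boundsOf(rects []rect) (x, y, w, h int) {
	minX, minY := 999999, 999999
	maxX, maxY := 0, 0
	for _, r := range rects {
		if r.x < minX {
			minX = r.x
		}
		if r.y < minY {
			minY = r.y
		}
		if r.x+r.w > maxX {
			maxX = r.x + r.w
		}
		if r.y+r.h > maxY {
			maxY = r.y + r.h
		}
	}
	return minX - 20, minY - 20, maxX - minX + 40, maxY - minY + 40
}

func main() {
	fmt.Println(boundsOf([]rect{{100, 100, 200, 150}, {400, 300, 200, 150}}))
	// → 80 80 540 390
}
```

The sentinel start values (999999 for the minimums, 0 for the maximums) match the hunk above; they assume canvas coordinates stay below 999999 and non-negative.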

		// Add relationships
		for _, schemaModel := range d.Schemas {
			for _, table := range schemaModel.Tables {

@@ -196,7 +196,9 @@ func (w *Writer) writeTableFile(table *models.Table, schema *models.Schema, db *
	}

	// Generate filename: {tableName}.ts
	filename := filepath.Join(w.options.OutputPath, table.Name+".ts")
	// Sanitize table name to remove quotes, comments, and invalid characters
	safeTableName := writers.SanitizeFilename(table.Name)
	filename := filepath.Join(w.options.OutputPath, safeTableName+".ts")
	return os.WriteFile(filename, []byte(code), 0644)
}

@@ -4,6 +4,7 @@ import (
	"sort"

	"git.warky.dev/wdevs/relspecgo/pkg/models"
	"git.warky.dev/wdevs/relspecgo/pkg/writers"
)

// TemplateData represents the data passed to the template for code generation
@@ -24,6 +25,7 @@ type ModelData struct {
	Fields          []*FieldData
	Config          *MethodConfig
	PrimaryKeyField string // Name of the primary key field
	PrimaryKeyType  string // Go type of the primary key field
	IDColumnName    string // Name of the ID column in database
	Prefix          string // 3-letter prefix
}
@@ -131,8 +133,11 @@ func NewModelData(table *models.Table, schema string, typeMapper *TypeMapper) *M
	// Find primary key
	for _, col := range table.Columns {
		if col.IsPrimaryKey {
			model.PrimaryKeyField = SnakeCaseToPascalCase(col.Name)
			model.IDColumnName = col.Name
			// Sanitize column name to remove backticks
			safeName := writers.SanitizeStructTagValue(col.Name)
			model.PrimaryKeyField = SnakeCaseToPascalCase(safeName)
			model.PrimaryKeyType = typeMapper.SQLTypeToGoType(col.Type, col.NotNull)
			model.IDColumnName = safeName
			break
		}
	}
@@ -141,6 +146,8 @@ func NewModelData(table *models.Table, schema string, typeMapper *TypeMapper) *M
	columns := sortColumns(table.Columns)
	for _, col := range columns {
		field := columnToField(col, table, typeMapper)
		// Check for name collision with generated methods and rename if needed
		field.Name = resolveFieldNameCollision(field.Name)
		model.Fields = append(model.Fields, field)
	}

@@ -149,10 +156,13 @@ func NewModelData(table *models.Table, schema string, typeMapper *TypeMapper) *M

// columnToField converts a models.Column to FieldData
func columnToField(col *models.Column, table *models.Table, typeMapper *TypeMapper) *FieldData {
	fieldName := SnakeCaseToPascalCase(col.Name)
	// Sanitize column name first to remove backticks before generating field name
	safeName := writers.SanitizeStructTagValue(col.Name)
	fieldName := SnakeCaseToPascalCase(safeName)
	goType := typeMapper.SQLTypeToGoType(col.Type, col.NotNull)
	gormTag := typeMapper.BuildGormTag(col, table)
	jsonTag := col.Name // Use column name for JSON tag
	// Use same sanitized name for JSON tag
	jsonTag := safeName

	return &FieldData{
		Name: fieldName,
@@ -184,6 +194,30 @@ func hasModelPrefix(name string) bool {
	return len(name) >= 5 && name[:5] == "Model"
}

// resolveFieldNameCollision checks if a field name conflicts with generated method names
// and adds an underscore suffix if there's a collision
func resolveFieldNameCollision(fieldName string) string {
	// List of method names that are generated by the template
	reservedNames := map[string]bool{
		"TableName":     true,
		"TableNameOnly": true,
		"SchemaName":    true,
		"GetID":         true,
		"GetIDStr":      true,
		"SetID":         true,
		"UpdateID":      true,
		"GetIDName":     true,
		"GetPrefix":     true,
	}

	// Check if field name conflicts with a reserved method name
	if reservedNames[fieldName] {
		return fieldName + "_"
	}

	return fieldName
}
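The collision rule above is a pure function and easy to check in isolation. A minimal harness (the reserved set is trimmed to three entries here for brevity; the `main` wrapper is illustrative):

```go
package main

import "fmt"

// resolveFieldNameCollision mirrors the renaming rule above: a generated
// struct field that would shadow a generated method gets a trailing underscore.
func resolveFieldNameCollision(fieldName string) string {
	reserved := map[string]bool{
		"TableName":  true,
		"SchemaName": true,
		"GetID":      true,
	}
	if reserved[fieldName] {
		return fieldName + "_"
	}
	return fieldName
}

func main() {
	fmt.Println(resolveFieldNameCollision("TableName")) // TableName_
	fmt.Println(resolveFieldNameCollision("UserName"))  // UserName
}
```

This is what the TestWriter_FieldNameCollision test earlier in this diff asserts: a `table_name` column becomes the field `TableName_`, while the generated `TableName()` method keeps its name.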

// sortColumns sorts columns by sequence, then by name
func sortColumns(columns map[string]*models.Column) []*models.Column {
	result := make([]*models.Column, 0, len(columns))

@@ -62,7 +62,7 @@ func (m {{.Name}}) SetID(newid int64) {
{{if and .Config.GenerateUpdateID .PrimaryKeyField}}
// UpdateID updates the primary key value
func (m *{{.Name}}) UpdateID(newid int64) {
	m.{{.PrimaryKeyField}} = int32(newid)
	m.{{.PrimaryKeyField}} = {{.PrimaryKeyType}}(newid)
}
{{end}}
{{if and .Config.GenerateGetIDName .IDColumnName}}

@@ -5,6 +5,7 @@ import (
	"strings"

	"git.warky.dev/wdevs/relspecgo/pkg/models"
	"git.warky.dev/wdevs/relspecgo/pkg/writers"
)

// TypeMapper handles type conversions between SQL and Go types
@@ -199,12 +200,15 @@ func (tm *TypeMapper) BuildGormTag(column *models.Column, table *models.Table) s
	var parts []string

	// Always include column name (lowercase as per user requirement)
	parts = append(parts, fmt.Sprintf("column:%s", column.Name))
	// Sanitize to remove backticks which would break struct tag syntax
	safeName := writers.SanitizeStructTagValue(column.Name)
	parts = append(parts, fmt.Sprintf("column:%s", safeName))

	// Add type if specified
	if column.Type != "" {
		// Include length, precision, scale if present
		typeStr := column.Type
		// Sanitize type to remove backticks
		typeStr := writers.SanitizeStructTagValue(column.Type)
		if column.Length > 0 {
			typeStr = fmt.Sprintf("%s(%d)", typeStr, column.Length)
		} else if column.Precision > 0 {
@@ -234,7 +238,9 @@ func (tm *TypeMapper) BuildGormTag(column *models.Column, table *models.Table) s

	// Default value
	if column.Default != nil {
		parts = append(parts, fmt.Sprintf("default:%v", column.Default))
		// Sanitize default value to remove backticks
		safeDefault := writers.SanitizeStructTagValue(fmt.Sprintf("%v", column.Default))
		parts = append(parts, fmt.Sprintf("default:%s", safeDefault))
	}

	// Check for unique constraint
@@ -331,5 +337,5 @@ func (tm *TypeMapper) NeedsFmtImport(generateGetIDStr bool) bool {

// GetSQLTypesImport returns the import path for sql_types
func (tm *TypeMapper) GetSQLTypesImport() string {
	return "github.com/bitechdev/ResolveSpec/pkg/common/sql_types"
	return "github.com/bitechdev/ResolveSpec/pkg/spectypes"
}
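Why this sanitization matters: Go struct tags are themselves backtick-delimited string literals, so a backtick surviving into a tag value (a column name, type, or default such as `` `now()` ``) would terminate the literal and produce uncompilable generated code. A hedged stand-in for `writers.SanitizeStructTagValue` that only strips backticks (the real helper may do more):

```go
package main

import (
	"fmt"
	"strings"
)

// sanitizeStructTagValue strips backticks so the value can be embedded inside
// a backtick-delimited struct tag literal. Illustrative stand-in only.
func sanitizeStructTagValue(s string) string {
	return strings.ReplaceAll(s, "`", "")
}

func main() {
	// Embedding the raw value "`now()`" here would split the tag literal.
	fmt.Printf("`gorm:\"default:%s\"`\n", sanitizeStructTagValue("`now()`"))
	// → `gorm:"default:now()"`
}
```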

@@ -201,7 +201,10 @@ func (w *Writer) writeMultiFile(db *models.Database) error {
	}

	// Generate filename: sql_{schema}_{table}.go
	filename := fmt.Sprintf("sql_%s_%s.go", schema.Name, table.Name)
	// Sanitize schema and table names to remove quotes, comments, and invalid characters
	safeSchemaName := writers.SanitizeFilename(schema.Name)
	safeTableName := writers.SanitizeFilename(table.Name)
	filename := fmt.Sprintf("sql_%s_%s.go", safeSchemaName, safeTableName)
	filepath := filepath.Join(w.options.OutputPath, filename)

	// Write file
@@ -216,6 +219,9 @@ func (w *Writer) writeMultiFile(db *models.Database) error {

// addRelationshipFields adds relationship fields to the model based on foreign keys
func (w *Writer) addRelationshipFields(modelData *ModelData, table *models.Table, schema *models.Schema, db *models.Database) {
	// Track used field names to detect duplicates
	usedFieldNames := make(map[string]int)

	// For each foreign key in this table, add a belongs-to relationship
	for _, constraint := range table.Constraints {
		if constraint.Type != models.ForeignKeyConstraint {
@@ -230,7 +236,8 @@ func (w *Writer) addRelationshipFields(modelData *ModelData, table *models.Table

		// Create relationship field (belongs-to)
		refModelName := w.getModelName(constraint.ReferencedTable)
		fieldName := w.generateRelationshipFieldName(constraint.ReferencedTable)
		fieldName := w.generateBelongsToFieldName(constraint)
		fieldName = w.ensureUniqueFieldName(fieldName, usedFieldNames)
		relationTag := w.typeMapper.BuildRelationshipTag(constraint, false)

		modelData.AddRelationshipField(&FieldData{
@@ -258,7 +265,8 @@ func (w *Writer) addRelationshipFields(modelData *ModelData, table *models.Table

			if constraint.ReferencedTable == table.Name && constraint.ReferencedSchema == schema.Name {
				// Add has-many relationship
				otherModelName := w.getModelName(otherTable.Name)
				fieldName := w.generateRelationshipFieldName(otherTable.Name) + "s" // Pluralize
				fieldName := w.generateHasManyFieldName(constraint, otherTable.Name)
				fieldName = w.ensureUniqueFieldName(fieldName, usedFieldNames)
				relationTag := w.typeMapper.BuildRelationshipTag(constraint, true)

				modelData.AddRelationshipField(&FieldData{
@@ -301,10 +309,60 @@ func (w *Writer) getModelName(tableName string) string {
	return modelName
}

// generateRelationshipFieldName generates a field name for a relationship
func (w *Writer) generateRelationshipFieldName(tableName string) string {
	// Use just the prefix (3 letters) for relationship fields
	return GeneratePrefix(tableName)
// generateBelongsToFieldName generates a field name for belongs-to relationships
// Uses the foreign key column name for uniqueness
func (w *Writer) generateBelongsToFieldName(constraint *models.Constraint) string {
	// Use the foreign key column name to ensure uniqueness
	// If there are multiple columns, use the first one
	if len(constraint.Columns) > 0 {
		columnName := constraint.Columns[0]
		// Convert to PascalCase for proper Go field naming
		// e.g., "rid_filepointer_request" -> "RelRIDFilepointerRequest"
		return "Rel" + SnakeCaseToPascalCase(columnName)
	}

	// Fallback to table-based prefix if no columns defined
	return "Rel" + GeneratePrefix(constraint.ReferencedTable)
}

// generateHasManyFieldName generates a field name for has-many relationships
// Uses the foreign key column name + source table name to avoid duplicates
func (w *Writer) generateHasManyFieldName(constraint *models.Constraint, sourceTableName string) string {
	// For has-many, we need to include the source table name to avoid duplicates
	// e.g., multiple tables referencing the same column on this table
	if len(constraint.Columns) > 0 {
		columnName := constraint.Columns[0]
		// Get the model name for the source table (pluralized)
		sourceModelName := w.getModelName(sourceTableName)
		// Remove "Model" prefix if present
		sourceModelName = strings.TrimPrefix(sourceModelName, "Model")

		// Convert column to PascalCase and combine with source table
		// e.g., "rid_api_provider" + "Login" -> "RelRIDAPIProviderLogins"
		columnPart := SnakeCaseToPascalCase(columnName)
		return "Rel" + columnPart + Pluralize(sourceModelName)
	}

	// Fallback to table-based naming
	sourceModelName := w.getModelName(sourceTableName)
	sourceModelName = strings.TrimPrefix(sourceModelName, "Model")
	return "Rel" + Pluralize(sourceModelName)
}

// ensureUniqueFieldName ensures a field name is unique by adding numeric suffixes if needed
func (w *Writer) ensureUniqueFieldName(fieldName string, usedNames map[string]int) string {
	originalName := fieldName
	count := usedNames[originalName]

	if count > 0 {
		// Name is already used, add numeric suffix
		fieldName = fmt.Sprintf("%s%d", originalName, count+1)
	}

	// Increment the counter for this base name
	usedNames[originalName]++

	return fieldName
}

// getPackageName returns the package name from options or defaults to "models"
@@ -380,6 +438,7 @@ func (w *Writer) createDatabaseRef(db *models.Database) *models.Database {
		DatabaseVersion: db.DatabaseVersion,
		SourceFormat:    db.SourceFormat,
		Schemas:         nil, // Don't include schemas to avoid circular reference
		GUID:            db.GUID,
	}
}

@@ -396,5 +455,6 @@ func (w *Writer) createSchemaRef(schema *models.Schema, db *models.Database) *mo
		Sequence:    schema.Sequence,
		RefDatabase: w.createDatabaseRef(db), // Include database ref
		Tables:      nil, // Don't include tables to avoid circular reference
		GUID:        schema.GUID,
	}
}

@@ -164,9 +164,437 @@ func TestWriter_WriteDatabase_MultiFile(t *testing.T) {
		t.Fatalf("Failed to read posts file: %v", err)
	}

	if !strings.Contains(string(postsContent), "USE *ModelUser") {
		// Relationship field should be present
		t.Logf("Posts content:\n%s", string(postsContent))
	postsStr := string(postsContent)

	// Verify relationship is present with new naming convention
	// Should now be RelUserID (belongs-to) instead of USE
	if !strings.Contains(postsStr, "RelUserID") {
		t.Errorf("Missing relationship field RelUserID (new naming convention)")
	}

	// Check users file contains has-many relationship
	usersContent, err := os.ReadFile(filepath.Join(tmpDir, "sql_public_users.go"))
	if err != nil {
		t.Fatalf("Failed to read users file: %v", err)
	}

	usersStr := string(usersContent)

	// Should have RelUserIDPosts (has-many) field
	if !strings.Contains(usersStr, "RelUserIDPosts") {
		t.Errorf("Missing has-many relationship field RelUserIDPosts")
	}
}

func TestWriter_MultipleReferencesToSameTable(t *testing.T) {
	// Test scenario: api_event table with multiple foreign keys to filepointer table
	db := models.InitDatabase("testdb")
	schema := models.InitSchema("org")

	// Filepointer table
	filepointer := models.InitTable("filepointer", "org")
	filepointer.Columns["id_filepointer"] = &models.Column{
		Name:         "id_filepointer",
		Type:         "bigserial",
		NotNull:      true,
		IsPrimaryKey: true,
	}
	schema.Tables = append(schema.Tables, filepointer)

	// API event table with two foreign keys to filepointer
	apiEvent := models.InitTable("api_event", "org")
	apiEvent.Columns["id_api_event"] = &models.Column{
		Name:         "id_api_event",
		Type:         "bigserial",
		NotNull:      true,
		IsPrimaryKey: true,
	}
	apiEvent.Columns["rid_filepointer_request"] = &models.Column{
		Name:    "rid_filepointer_request",
		Type:    "bigint",
		NotNull: false,
	}
	apiEvent.Columns["rid_filepointer_response"] = &models.Column{
		Name:    "rid_filepointer_response",
		Type:    "bigint",
		NotNull: false,
	}

	// Add constraints
	apiEvent.Constraints["fk_request"] = &models.Constraint{
		Name:              "fk_request",
		Type:              models.ForeignKeyConstraint,
		Columns:           []string{"rid_filepointer_request"},
		ReferencedTable:   "filepointer",
		ReferencedSchema:  "org",
		ReferencedColumns: []string{"id_filepointer"},
	}
	apiEvent.Constraints["fk_response"] = &models.Constraint{
		Name:              "fk_response",
		Type:              models.ForeignKeyConstraint,
		Columns:           []string{"rid_filepointer_response"},
		ReferencedTable:   "filepointer",
		ReferencedSchema:  "org",
		ReferencedColumns: []string{"id_filepointer"},
	}

	schema.Tables = append(schema.Tables, apiEvent)
	db.Schemas = append(db.Schemas, schema)

	// Create writer
	tmpDir := t.TempDir()
	opts := &writers.WriterOptions{
		PackageName: "models",
		OutputPath:  tmpDir,
		Metadata: map[string]interface{}{
			"multi_file": true,
		},
	}

	writer := NewWriter(opts)
	err := writer.WriteDatabase(db)
	if err != nil {
		t.Fatalf("WriteDatabase failed: %v", err)
	}

	// Read the api_event file
	apiEventContent, err := os.ReadFile(filepath.Join(tmpDir, "sql_org_api_event.go"))
	if err != nil {
		t.Fatalf("Failed to read api_event file: %v", err)
	}

	contentStr := string(apiEventContent)
|
||||
|
||||
// Verify both relationships have unique names based on column names
|
||||
expectations := []struct {
|
||||
fieldName string
|
||||
tag string
|
||||
}{
|
||||
{"RelRIDFilepointerRequest", "foreignKey:RIDFilepointerRequest"},
|
||||
{"RelRIDFilepointerResponse", "foreignKey:RIDFilepointerResponse"},
|
||||
}
|
||||
|
||||
for _, exp := range expectations {
|
||||
if !strings.Contains(contentStr, exp.fieldName) {
|
||||
t.Errorf("Missing relationship field: %s\nGenerated:\n%s", exp.fieldName, contentStr)
|
||||
}
|
||||
if !strings.Contains(contentStr, exp.tag) {
|
||||
t.Errorf("Missing relationship tag: %s\nGenerated:\n%s", exp.tag, contentStr)
|
||||
}
|
||||
}
|
||||
|
||||
// Verify NO duplicate field names (old behavior would create duplicate "FIL" fields)
|
||||
if strings.Contains(contentStr, "FIL *ModelFilepointer") {
|
||||
t.Errorf("Found old prefix-based naming (FIL), should use column-based naming")
|
||||
}
|
||||
|
||||
// Also verify has-many relationships on filepointer table
|
||||
filepointerContent, err := os.ReadFile(filepath.Join(tmpDir, "sql_org_filepointer.go"))
|
||||
if err != nil {
|
||||
t.Fatalf("Failed to read filepointer file: %v", err)
|
||||
}
|
||||
|
||||
filepointerStr := string(filepointerContent)
|
||||
|
||||
// Should have two different has-many relationships with unique names
|
||||
hasManyExpectations := []string{
|
||||
"RelRIDFilepointerRequestAPIEvents", // Has many via rid_filepointer_request
|
||||
"RelRIDFilepointerResponseAPIEvents", // Has many via rid_filepointer_response
|
||||
}
|
||||
|
||||
for _, exp := range hasManyExpectations {
|
||||
if !strings.Contains(filepointerStr, exp) {
|
||||
t.Errorf("Missing has-many relationship field: %s\nGenerated:\n%s", exp, filepointerStr)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
func TestWriter_MultipleHasManyRelationships(t *testing.T) {
|
||||
// Test scenario: api_provider table referenced by multiple tables via rid_api_provider
|
||||
db := models.InitDatabase("testdb")
|
||||
schema := models.InitSchema("org")
|
||||
|
||||
// Owner table
|
||||
owner := models.InitTable("owner", "org")
|
||||
owner.Columns["id_owner"] = &models.Column{
|
||||
Name: "id_owner",
|
||||
Type: "bigserial",
|
||||
NotNull: true,
|
||||
IsPrimaryKey: true,
|
||||
}
|
||||
schema.Tables = append(schema.Tables, owner)
|
||||
|
||||
// API Provider table
|
||||
apiProvider := models.InitTable("api_provider", "org")
|
||||
apiProvider.Columns["id_api_provider"] = &models.Column{
|
||||
Name: "id_api_provider",
|
||||
Type: "bigserial",
|
||||
NotNull: true,
|
||||
IsPrimaryKey: true,
|
||||
}
|
||||
apiProvider.Columns["rid_owner"] = &models.Column{
|
||||
Name: "rid_owner",
|
||||
Type: "bigint",
|
||||
NotNull: true,
|
||||
}
|
||||
apiProvider.Constraints["fk_owner"] = &models.Constraint{
|
||||
Name: "fk_owner",
|
||||
Type: models.ForeignKeyConstraint,
|
||||
Columns: []string{"rid_owner"},
|
||||
ReferencedTable: "owner",
|
||||
ReferencedSchema: "org",
|
||||
ReferencedColumns: []string{"id_owner"},
|
||||
}
|
||||
schema.Tables = append(schema.Tables, apiProvider)
|
||||
|
||||
// Login table
|
||||
login := models.InitTable("login", "org")
|
||||
login.Columns["id_login"] = &models.Column{
|
||||
Name: "id_login",
|
||||
Type: "bigserial",
|
||||
NotNull: true,
|
||||
IsPrimaryKey: true,
|
||||
}
|
||||
login.Columns["rid_api_provider"] = &models.Column{
|
||||
Name: "rid_api_provider",
|
||||
Type: "bigint",
|
||||
NotNull: true,
|
||||
}
|
||||
login.Constraints["fk_api_provider"] = &models.Constraint{
|
||||
Name: "fk_api_provider",
|
||||
Type: models.ForeignKeyConstraint,
|
||||
Columns: []string{"rid_api_provider"},
|
||||
ReferencedTable: "api_provider",
|
||||
ReferencedSchema: "org",
|
||||
ReferencedColumns: []string{"id_api_provider"},
|
||||
}
|
||||
schema.Tables = append(schema.Tables, login)
|
||||
|
||||
// Filepointer table
|
||||
filepointer := models.InitTable("filepointer", "org")
|
||||
filepointer.Columns["id_filepointer"] = &models.Column{
|
||||
Name: "id_filepointer",
|
||||
Type: "bigserial",
|
||||
NotNull: true,
|
||||
IsPrimaryKey: true,
|
||||
}
|
||||
filepointer.Columns["rid_api_provider"] = &models.Column{
|
||||
Name: "rid_api_provider",
|
||||
Type: "bigint",
|
||||
NotNull: true,
|
||||
}
|
||||
filepointer.Constraints["fk_api_provider"] = &models.Constraint{
|
||||
Name: "fk_api_provider",
|
||||
Type: models.ForeignKeyConstraint,
|
||||
Columns: []string{"rid_api_provider"},
|
||||
ReferencedTable: "api_provider",
|
||||
ReferencedSchema: "org",
|
||||
ReferencedColumns: []string{"id_api_provider"},
|
||||
}
|
||||
schema.Tables = append(schema.Tables, filepointer)
|
||||
|
||||
// API Event table
|
||||
apiEvent := models.InitTable("api_event", "org")
|
||||
apiEvent.Columns["id_api_event"] = &models.Column{
|
||||
Name: "id_api_event",
|
||||
Type: "bigserial",
|
||||
NotNull: true,
|
||||
IsPrimaryKey: true,
|
||||
}
|
||||
apiEvent.Columns["rid_api_provider"] = &models.Column{
|
||||
Name: "rid_api_provider",
|
||||
Type: "bigint",
|
||||
NotNull: true,
|
||||
}
|
||||
apiEvent.Constraints["fk_api_provider"] = &models.Constraint{
|
||||
Name: "fk_api_provider",
|
||||
Type: models.ForeignKeyConstraint,
|
||||
Columns: []string{"rid_api_provider"},
|
||||
ReferencedTable: "api_provider",
|
||||
ReferencedSchema: "org",
|
||||
ReferencedColumns: []string{"id_api_provider"},
|
||||
}
|
||||
schema.Tables = append(schema.Tables, apiEvent)
|
||||
|
||||
db.Schemas = append(db.Schemas, schema)
|
||||
|
||||
// Create writer
|
||||
tmpDir := t.TempDir()
|
||||
opts := &writers.WriterOptions{
|
||||
PackageName: "models",
|
||||
OutputPath: tmpDir,
|
||||
Metadata: map[string]interface{}{
|
||||
"multi_file": true,
|
||||
},
|
||||
}
|
||||
|
||||
writer := NewWriter(opts)
|
||||
err := writer.WriteDatabase(db)
|
||||
if err != nil {
|
||||
t.Fatalf("WriteDatabase failed: %v", err)
|
||||
}
|
||||
|
||||
// Read the api_provider file
|
||||
apiProviderContent, err := os.ReadFile(filepath.Join(tmpDir, "sql_org_api_provider.go"))
|
||||
if err != nil {
|
||||
t.Fatalf("Failed to read api_provider file: %v", err)
|
||||
}
|
||||
|
||||
contentStr := string(apiProviderContent)
|
||||
|
||||
// Verify all has-many relationships have unique names
|
||||
hasManyExpectations := []string{
|
||||
"RelRIDAPIProviderLogins", // Has many via Login
|
||||
"RelRIDAPIProviderFilepointers", // Has many via Filepointer
|
||||
"RelRIDAPIProviderAPIEvents", // Has many via APIEvent
|
||||
"RelRIDOwner", // Belongs to via rid_owner
|
||||
}
|
||||
|
||||
for _, exp := range hasManyExpectations {
|
||||
if !strings.Contains(contentStr, exp) {
|
||||
t.Errorf("Missing relationship field: %s\nGenerated:\n%s", exp, contentStr)
|
||||
}
|
||||
}
|
||||
|
||||
// Verify NO duplicate field names
|
||||
// Count occurrences of "RelRIDAPIProvider" fields - should have 3 unique ones
|
||||
count := strings.Count(contentStr, "RelRIDAPIProvider")
|
||||
if count != 3 {
|
||||
t.Errorf("Expected 3 RelRIDAPIProvider* fields, found %d\nGenerated:\n%s", count, contentStr)
|
||||
}
|
||||
|
||||
// Verify no duplicate declarations (would cause compilation error)
|
||||
duplicatePattern := "RelRIDAPIProviders []*Model"
|
||||
if strings.Contains(contentStr, duplicatePattern) {
|
||||
t.Errorf("Found duplicate field declaration pattern, fields should be unique")
|
||||
}
|
||||
}
|
||||
|
||||
func TestWriter_FieldNameCollision(t *testing.T) {
|
||||
// Test scenario: table with columns that would conflict with generated method names
|
||||
table := models.InitTable("audit_table", "audit")
|
||||
table.Columns["id_audit_table"] = &models.Column{
|
||||
Name: "id_audit_table",
|
||||
Type: "smallint",
|
||||
NotNull: true,
|
||||
IsPrimaryKey: true,
|
||||
Sequence: 1,
|
||||
}
|
||||
table.Columns["table_name"] = &models.Column{
|
||||
Name: "table_name",
|
||||
Type: "varchar",
|
||||
Length: 100,
|
||||
NotNull: true,
|
||||
Sequence: 2,
|
||||
}
|
||||
table.Columns["table_schema"] = &models.Column{
|
||||
Name: "table_schema",
|
||||
Type: "varchar",
|
||||
Length: 100,
|
||||
NotNull: true,
|
||||
Sequence: 3,
|
||||
}
|
||||
|
||||
// Create writer
|
||||
tmpDir := t.TempDir()
|
||||
opts := &writers.WriterOptions{
|
||||
PackageName: "models",
|
||||
OutputPath: filepath.Join(tmpDir, "test.go"),
|
||||
}
|
||||
|
||||
writer := NewWriter(opts)
|
||||
|
||||
err := writer.WriteTable(table)
|
||||
if err != nil {
|
||||
t.Fatalf("WriteTable failed: %v", err)
|
||||
}
|
||||
|
||||
// Read the generated file
|
||||
content, err := os.ReadFile(opts.OutputPath)
|
||||
if err != nil {
|
||||
t.Fatalf("Failed to read generated file: %v", err)
|
||||
}
|
||||
|
||||
generated := string(content)
|
||||
|
||||
// Verify that TableName field was renamed to TableName_ to avoid collision
|
||||
if !strings.Contains(generated, "TableName_") {
|
||||
t.Errorf("Expected field 'TableName_' (with underscore) but not found\nGenerated:\n%s", generated)
|
||||
}
|
||||
|
||||
// Verify the struct tag still references the correct database column
|
||||
if !strings.Contains(generated, `gorm:"column:table_name;`) {
|
||||
t.Errorf("Expected gorm tag to reference 'table_name' column\nGenerated:\n%s", generated)
|
||||
}
|
||||
|
||||
// Verify the TableName() method still exists and doesn't conflict
|
||||
if !strings.Contains(generated, "func (m ModelAuditTable) TableName() string") {
|
||||
t.Errorf("TableName() method should still be generated\nGenerated:\n%s", generated)
|
||||
}
|
||||
|
||||
// Verify NO field named just "TableName" (without underscore)
|
||||
if strings.Contains(generated, "TableName sql_types") || strings.Contains(generated, "TableName string") {
|
||||
t.Errorf("Field 'TableName' without underscore should not exist (would conflict with method)\nGenerated:\n%s", generated)
|
||||
}
|
||||
}
|
||||
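The collision this test guards against can be seen in a standalone sketch. `ModelAuditTable` here is a hand-written stand-in for the generated struct, not actual RelSpec output: a Go struct cannot declare a field and a method with the same name, and GORM resolves custom table names through a `TableName()` method, so a column literally named `table_name` must map to a differently named field.

```go
package main

import "fmt"

// A column called table_name keeps its real column name in the gorm
// tag, but the Go field is suffixed with an underscore so it does not
// collide with the TableName() method below.
type ModelAuditTable struct {
	TableName_ string `gorm:"column:table_name"`
}

// TableName tells GORM which table backs this model.
func (m ModelAuditTable) TableName() string { return "audit.audit_table" }

func main() {
	m := ModelAuditTable{TableName_: "orders"}
	fmt.Println(m.TableName(), m.TableName_)
}
```

Renaming the field to `TableName` instead would fail to compile with a "field and method with the same name" error.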

func TestWriter_UpdateIDTypeSafety(t *testing.T) {
	// Test scenario: tables with different primary key types
	tests := []struct {
		name       string
		pkType     string
		expectedPK string
		castType   string
	}{
		{"int32_pk", "int", "int32", "int32(newid)"},
		{"int16_pk", "smallint", "int16", "int16(newid)"},
		{"int64_pk", "bigint", "int64", "int64(newid)"},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			table := models.InitTable("test_table", "public")
			table.Columns["id"] = &models.Column{
				Name:         "id",
				Type:         tt.pkType,
				NotNull:      true,
				IsPrimaryKey: true,
			}

			tmpDir := t.TempDir()
			opts := &writers.WriterOptions{
				PackageName: "models",
				OutputPath:  filepath.Join(tmpDir, "test.go"),
			}

			writer := NewWriter(opts)
			err := writer.WriteTable(table)
			if err != nil {
				t.Fatalf("WriteTable failed: %v", err)
			}

			content, err := os.ReadFile(opts.OutputPath)
			if err != nil {
				t.Fatalf("Failed to read generated file: %v", err)
			}

			generated := string(content)

			// Verify UpdateID method has correct type cast
			if !strings.Contains(generated, tt.castType) {
				t.Errorf("Expected UpdateID to cast to %s\nGenerated:\n%s", tt.castType, generated)
			}

			// Verify no invalid int32(newid) for non-int32 types
			if tt.expectedPK != "int32" && strings.Contains(generated, "int32(newid)") {
				t.Errorf("UpdateID should not cast to int32 for %s type\nGenerated:\n%s", tt.pkType, generated)
			}

			// Verify UpdateID parameter is int64 (for consistency)
			if !strings.Contains(generated, "UpdateID(newid int64)") {
				t.Errorf("UpdateID should accept int64 parameter\nGenerated:\n%s", generated)
			}
		})
	}
}

@@ -7,6 +7,7 @@ type TemplateData struct {
	// One of these will be populated based on execution mode
	Database *models.Database
	Schema   *models.Schema
	Domain   *models.Domain
	Script   *models.Script
	Table    *models.Table

@@ -57,6 +58,15 @@ func NewSchemaData(schema *models.Schema, metadata map[string]interface{}) *Temp
	}
}

// NewDomainData creates template data for domain mode
func NewDomainData(domain *models.Domain, db *models.Database, metadata map[string]interface{}) *TemplateData {
	return &TemplateData{
		Domain:         domain,
		ParentDatabase: db,
		Metadata:       metadata,
	}
}

// NewScriptData creates template data for script mode
func NewScriptData(script *models.Script, schema *models.Schema, db *models.Database, metadata map[string]interface{}) *TemplateData {
	return &TemplateData{

@@ -85,6 +95,9 @@ func (td *TemplateData) Name() string {
	if td.Schema != nil {
		return td.Schema.Name
	}
	if td.Domain != nil {
		return td.Domain.Name
	}
	if td.Script != nil {
		return td.Script.Name
	}

@@ -20,6 +20,8 @@ const (
	DatabaseMode EntrypointMode = "database"
	// SchemaMode executes the template once per schema (multi-file output)
	SchemaMode EntrypointMode = "schema"
	// DomainMode executes the template once per domain (multi-file output)
	DomainMode EntrypointMode = "domain"
	// ScriptMode executes the template once per script (multi-file output)
	ScriptMode EntrypointMode = "script"
	// TableMode executes the template once per table (multi-file output)

@@ -80,6 +82,8 @@ func (w *Writer) WriteDatabase(db *models.Database) error {
		return w.executeDatabaseMode(db)
	case SchemaMode:
		return w.executeSchemaMode(db)
	case DomainMode:
		return w.executeDomainMode(db)
	case ScriptMode:
		return w.executeScriptMode(db)
	case TableMode:

@@ -143,6 +147,28 @@ func (w *Writer) executeSchemaMode(db *models.Database) error {
	return nil
}

// executeDomainMode executes the template once per domain
func (w *Writer) executeDomainMode(db *models.Database) error {
	for _, domain := range db.Domains {
		data := NewDomainData(domain, db, w.options.Metadata)
		output, err := w.executeTemplate(data)
		if err != nil {
			return fmt.Errorf("failed to execute template for domain %s: %w", domain.Name, err)
		}

		filename, err := w.generateFilename(data)
		if err != nil {
			return fmt.Errorf("failed to generate filename for domain %s: %w", domain.Name, err)
		}

		if err := w.writeOutput(output, filename); err != nil {
			return fmt.Errorf("failed to write output for domain %s: %w", domain.Name, err)
		}
	}

	return nil
}

// executeScriptMode executes the template once per script
func (w *Writer) executeScriptMode(db *models.Database) error {
	for _, schema := range db.Schemas {

@@ -1,6 +1,9 @@
package writers

import (
	"regexp"
	"strings"

	"git.warky.dev/wdevs/relspecgo/pkg/models"
)

@@ -28,3 +31,56 @@ type WriterOptions struct {
	// Additional options can be added here as needed
	Metadata map[string]interface{}
}

// SanitizeFilename removes quotes, comments, and invalid characters from identifiers
// to make them safe for use in filenames. This handles:
//   - Double and single quotes: "table_name" or 'table_name' -> table_name
//   - DBML comments: table [note: 'description'] -> table
//   - Invalid filename characters: replaced with underscores
func SanitizeFilename(name string) string {
	// Remove DBML/DCTX style comments in brackets (e.g., [note: 'description'])
	commentRegex := regexp.MustCompile(`\s*\[.*?\]\s*`)
	name = commentRegex.ReplaceAllString(name, "")

	// Remove quotes (both single and double)
	name = strings.ReplaceAll(name, `"`, "")
	name = strings.ReplaceAll(name, `'`, "")

	// Remove backticks (MySQL style identifiers)
	name = strings.ReplaceAll(name, "`", "")

	// Replace invalid filename characters with underscores
	// Invalid chars: / \ : * ? " < > | and control characters
	invalidChars := regexp.MustCompile(`[/\\:*?"<>|\x00-\x1f\x7f]`)
	name = invalidChars.ReplaceAllString(name, "_")

	// Trim whitespace and consecutive underscores
	name = strings.TrimSpace(name)
	name = regexp.MustCompile(`_+`).ReplaceAllString(name, "_")
	name = strings.Trim(name, "_")

	return name
}

// SanitizeStructTagValue sanitizes a value to be safely used inside Go struct tags.
// Go struct tags are delimited by backticks, so any backtick in the value would break the syntax.
// This function:
//   - Removes DBML/DCTX comments in brackets
//   - Removes all quotes (double, single, and backticks)
//   - Returns a clean identifier safe for use in struct tags and field names
func SanitizeStructTagValue(value string) string {
	// Remove DBML/DCTX style comments in brackets (e.g., [note: 'description'])
	commentRegex := regexp.MustCompile(`\s*\[.*?\]\s*`)
	value = commentRegex.ReplaceAllString(value, "")

	// Trim whitespace
	value = strings.TrimSpace(value)

	// Remove all quotes: backticks, double quotes, and single quotes
	// This ensures the value is clean for use as Go identifiers and struct tag values
	value = strings.ReplaceAll(value, "`", "")
	value = strings.ReplaceAll(value, `"`, "")
	value = strings.ReplaceAll(value, `'`, "")

	return value
}

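The cleanup pipeline above can be exercised with a minimal, self-contained sketch; the `sanitize` helper below reimplements the same steps for illustration rather than importing the `writers` package:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// sanitize mirrors SanitizeFilename's steps: strip bracketed DBML
// comments, drop quotes and backticks, replace characters that are
// invalid in filenames, then collapse and trim underscores.
func sanitize(name string) string {
	name = regexp.MustCompile(`\s*\[.*?\]\s*`).ReplaceAllString(name, "")
	for _, q := range []string{`"`, `'`, "`"} {
		name = strings.ReplaceAll(name, q, "")
	}
	name = regexp.MustCompile(`[/\\:*?"<>|\x00-\x1f\x7f]`).ReplaceAllString(name, "_")
	name = strings.TrimSpace(name)
	name = regexp.MustCompile(`_+`).ReplaceAllString(name, "_")
	return strings.Trim(name, "_")
}

func main() {
	fmt.Println(sanitize(`"users" [note: 'account table']`)) // users
	fmt.Println(sanitize("my`table/name"))                   // mytable_name
}
```

Note the order matters: the bracket regex runs first so that quotes inside a `[note: ...]` comment disappear with the comment instead of leaving fragments behind.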
5 tests/assets/dbml/multifile/1_users.dbml Normal file
@@ -0,0 +1,5 @@
// First file - users table basic structure
Table public.users {
  id bigint [pk, increment]
  email varchar(255) [unique, not null]
}
8 tests/assets/dbml/multifile/2_posts.dbml Normal file
@@ -0,0 +1,8 @@
// Second file - posts table
Table public.posts {
  id bigint [pk, increment]
  user_id bigint [not null]
  title varchar(200) [not null]
  content text
  created_at timestamp [not null]
}
5 tests/assets/dbml/multifile/3_add_columns.dbml Normal file
@@ -0,0 +1,5 @@
// Third file - adds more columns to users table (tests merging)
Table public.users {
  name varchar(100)
  created_at timestamp [not null]
}
10 tests/assets/dbml/multifile/9_refs.dbml Normal file
@@ -0,0 +1,10 @@
// File with commented-out refs - should load last
// Contains relationships that depend on earlier tables

// Ref: public.posts.user_id > public.users.id [ondelete: CASCADE]

Table public.comments {
  id bigint [pk, increment]
  post_id bigint [not null]
  content text [not null]
}
13 vendor/github.com/gdamore/encoding/.appveyor.yml generated vendored Normal file
@@ -0,0 +1,13 @@
version: 1.0.{build}
clone_folder: c:\gopath\src\github.com\gdamore\encoding
environment:
  GOPATH: c:\gopath
build_script:
  - go version
  - go env
  - SET PATH=%LOCALAPPDATA%\atom\bin;%GOPATH%\bin;%PATH%
  - go get -t ./...
  - go build
  - go install ./...
test_script:
  - go test ./...
73 vendor/github.com/gdamore/encoding/CODE_OF_CONDUCT.md generated vendored Normal file
@@ -0,0 +1,73 @@
# Contributor Covenant Code of Conduct

## Our Pledge

In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, gender identity and expression, level of experience,
nationality, personal appearance, race, religion, or sexual identity and
orientation.

## Our Standards

Examples of behavior that contributes to creating a positive environment
include:

* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members

Examples of unacceptable behavior by participants include:

* The use of sexualized language or imagery and unwelcome sexual attention or
  advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
  address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
  professional setting

## Our Responsibilities

Project maintainers are responsible for clarifying the standards of acceptable
behavior and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behavior.

Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviors that they deem inappropriate,
threatening, offensive, or harmful.

## Scope

This Code of Conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community. Examples of
representing a project or community include using an official project e-mail
address, posting via an official social media account, or acting as an appointed
representative at an online or offline event. Representation of a project may be
further defined and clarified by project maintainers.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the project team at garrett@damore.org. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
Further details of specific enforcement policies may be posted separately.

Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html

[homepage]: https://www.contributor-covenant.org
202 vendor/github.com/gdamore/encoding/LICENSE generated vendored Normal file
@@ -0,0 +1,202 @@

                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
|
||||
origin of the Work and reproducing the content of the NOTICE file.
|
||||
|
||||
7. Disclaimer of Warranty. Unless required by applicable law or
|
||||
agreed to in writing, Licensor provides the Work (and each
|
||||
Contributor provides its Contributions) on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
implied, including, without limitation, any warranties or conditions
|
||||
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
|
||||
PARTICULAR PURPOSE. You are solely responsible for determining the
|
||||
appropriateness of using or redistributing the Work and assume any
|
||||
risks associated with Your exercise of permissions under this License.
|
||||
|
||||
8. Limitation of Liability. In no event and under no legal theory,
|
||||
whether in tort (including negligence), contract, or otherwise,
|
||||
unless required by applicable law (such as deliberate and grossly
|
||||
negligent acts) or agreed to in writing, shall any Contributor be
|
||||
liable to You for damages, including any direct, indirect, special,
|
||||
incidental, or consequential damages of any character arising as a
|
||||
result of this License or out of the use or inability to use the
|
||||
Work (including but not limited to damages for loss of goodwill,
|
||||
work stoppage, computer failure or malfunction, or any and all
|
||||
other commercial damages or losses), even if such Contributor
|
||||
has been advised of the possibility of such damages.
|
||||
|
||||
9. Accepting Warranty or Additional Liability. While redistributing
|
||||
the Work or Derivative Works thereof, You may choose to offer,
|
||||
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||
or other liability obligations and/or rights consistent with this
|
||||
License. However, in accepting such obligations, You may act only
|
||||
on Your own behalf and on Your sole responsibility, not on behalf
|
||||
of any other Contributor, and only if You agree to indemnify,
|
||||
defend, and hold each Contributor harmless for any liability
|
||||
incurred by, or claims asserted against, such Contributor by reason
|
||||
of your accepting any such warranty or additional liability.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
APPENDIX: How to apply the Apache License to your work.
|
||||
|
||||
To apply the Apache License to your work, attach the following
|
||||
boilerplate notice, with the fields enclosed by brackets "[]"
|
||||
replaced with your own identifying information. (Don't include
|
||||
the brackets!) The text should be enclosed in the appropriate
|
||||
comment syntax for the file format. We also recommend that a
|
||||
file or class name and description of purpose be included on the
|
||||
same "printed page" as the copyright notice for easier
|
||||
identification within third-party archives.
|
||||
|
||||
Copyright [yyyy] [name of copyright owner]
|
||||
|
||||
Licensed under the Apache License, Version 2.0 (the "License");
|
||||
you may not use this file except in compliance with the License.
|
||||
You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing, software
|
||||
distributed under the License is distributed on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
See the License for the specific language governing permissions and
|
||||
limitations under the License.
|
||||
20 vendor/github.com/gdamore/encoding/README.md generated vendored Normal file
@@ -0,0 +1,20 @@
## encoding

[](https://github.com/gdamore/encoding/actions/workflows/linux.yml)
[](https://github.com/gdamore/encoding/actions/workflows/windows.yml)
[](https://github.com/gdamore/encoding/blob/master/LICENSE)
[](https://codecov.io/gh/gdamore/encoding)
[](https://godoc.org/github.com/gdamore/encoding)

Package encoding provides a number of encodings that are missing from the
standard Go [encoding]("https://godoc.org/golang.org/x/text/encoding") package.

We hope that we can contribute these to the standard Go library someday. It
turns out that some of these are useful for dealing with I/O streams coming
from non-UTF friendly sources.

The UTF8 Encoder is also useful for situations where valid UTF-8 might be
carried in streams that contain non-valid UTF; in particular I use it for
helping me cope with terminals that embed escape sequences in otherwise
valid UTF-8.
12 vendor/github.com/gdamore/encoding/SECURITY.md generated vendored Normal file
@@ -0,0 +1,12 @@
# Security Policy

We take security very seriously in mangos, since you may be using it in
Internet-facing applications.

## Reporting a Vulnerability

To report a vulnerability, please contact us on our discord.
You may also send an email to garrett@damore.org, or info@staysail.tech.

We will keep the reporter updated on any status updates on a regular basis,
and will respond within two business days for any reported security issue.
36 vendor/github.com/gdamore/encoding/ascii.go generated vendored Normal file
@@ -0,0 +1,36 @@
// Copyright 2015 Garrett D'Amore
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package encoding

import (
	"golang.org/x/text/encoding"
)

// ASCII represents the 7-bit US-ASCII scheme. It decodes directly to
// UTF-8 without change, as all ASCII values are legal UTF-8.
// Unicode values less than 128 (i.e. 7 bits) map 1:1 with ASCII.
// It encodes runes outside of that to 0x1A, the ASCII substitution character.
var ASCII encoding.Encoding

func init() {
	amap := make(map[byte]rune)
	for i := 128; i <= 255; i++ {
		amap[byte(i)] = RuneError
	}

	cm := &Charmap{Map: amap}
	cm.Init()
	ASCII = cm
}
195 vendor/github.com/gdamore/encoding/charmap.go generated vendored Normal file
@@ -0,0 +1,195 @@
// Copyright 2024 Garrett D'Amore
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package encoding

import (
	"sync"
	"unicode/utf8"

	"golang.org/x/text/encoding"
	"golang.org/x/text/transform"
)

const (
	// RuneError is an alias for the UTF-8 replacement rune, '\uFFFD'.
	RuneError = '\uFFFD'

	// RuneSelf is the rune below which UTF-8 and the Unicode values are
	// identical. Its also the limit for ASCII.
	RuneSelf = 0x80

	// ASCIISub is the ASCII substitution character.
	ASCIISub = '\x1a'
)

// Charmap is a structure for setting up encodings for 8-bit character sets,
// for transforming between UTF8 and that other character set. It has some
// ideas borrowed from golang.org/x/text/encoding/charmap, but it uses a
// different implementation. This implementation uses maps, and supports
// user-defined maps.
//
// We do assume that a character map has a reasonable substitution character,
// and that valid encodings are stable (exactly a 1:1 map) and stateless
// (that is there is no shift character or anything like that.) Hence this
// approach will not work for many East Asian character sets.
//
// Measurement shows little or no measurable difference in the performance of
// the two approaches. The difference was down to a couple of nsec/op, and
// no consistent pattern as to which ran faster. With the conversion to
// UTF-8 the code takes about 25 nsec/op. The conversion in the reverse
// direction takes about 100 nsec/op. (The larger cost for conversion
// from UTF-8 is most likely due to the need to convert the UTF-8 byte stream
// to a rune before conversion.
type Charmap struct {
	transform.NopResetter
	bytes map[rune]byte
	runes [256][]byte
	once  sync.Once

	// The map between bytes and runes. To indicate that a specific
	// byte value is invalid for a charcter set, use the rune
	// utf8.RuneError. Values that are absent from this map will
	// be assumed to have the identity mapping -- that is the default
	// is to assume ISO8859-1, where all 8-bit characters have the same
	// numeric value as their Unicode runes. (Not to be confused with
	// the UTF-8 values, which *will* be different for non-ASCII runes.)
	//
	// If no values less than RuneSelf are changed (or have non-identity
	// mappings), then the character set is assumed to be an ASCII
	// superset, and certain assumptions and optimizations become
	// available for ASCII bytes.
	Map map[byte]rune

	// The ReplacementChar is the byte value to use for substitution.
	// It should normally be ASCIISub for ASCII encodings. This may be
	// unset (left to zero) for mappings that are strictly ASCII supersets.
	// In that case ASCIISub will be assumed instead.
	ReplacementChar byte
}

type cmapDecoder struct {
	transform.NopResetter
	runes [256][]byte
}

type cmapEncoder struct {
	transform.NopResetter
	bytes   map[rune]byte
	replace byte
}

// Init initializes internal values of a character map. This should
// be done early, to minimize the cost of allocation of transforms
// later. It is not strictly necessary however, as the allocation
// functions will arrange to call it if it has not already been done.
func (c *Charmap) Init() {
	c.once.Do(c.initialize)
}

func (c *Charmap) initialize() {
	c.bytes = make(map[rune]byte)
	ascii := true

	for i := 0; i < 256; i++ {
		r, ok := c.Map[byte(i)]
		if !ok {
			r = rune(i)
		}
		if r < 128 && r != rune(i) {
			ascii = false
		}
		if r != RuneError {
			c.bytes[r] = byte(i)
		}
		utf := make([]byte, utf8.RuneLen(r))
		utf8.EncodeRune(utf, r)
		c.runes[i] = utf
	}
	if ascii && c.ReplacementChar == '\x00' {
		c.ReplacementChar = ASCIISub
	}
}

// NewDecoder returns a Decoder the converts from the 8-bit
// character set to UTF-8. Unknown mappings, if any, are mapped
// to '\uFFFD'.
func (c *Charmap) NewDecoder() *encoding.Decoder {
	c.Init()
	return &encoding.Decoder{Transformer: &cmapDecoder{runes: c.runes}}
}

// NewEncoder returns a Transformer that converts from UTF8 to the
// 8-bit character set. Unknown mappings are mapped to 0x1A.
func (c *Charmap) NewEncoder() *encoding.Encoder {
	c.Init()
	return &encoding.Encoder{
		Transformer: &cmapEncoder{
			bytes:   c.bytes,
			replace: c.ReplacementChar,
		},
	}
}

func (d *cmapDecoder) Transform(dst, src []byte, atEOF bool) (int, int, error) {
	var e error
	var ndst, nsrc int

	for _, c := range src {
		b := d.runes[c]
		l := len(b)

		if ndst+l > len(dst) {
			e = transform.ErrShortDst
			break
		}
		for i := 0; i < l; i++ {
			dst[ndst] = b[i]
			ndst++
		}
		nsrc++
	}
	return ndst, nsrc, e
}

func (d *cmapEncoder) Transform(dst, src []byte, atEOF bool) (int, int, error) {
	var e error
	var ndst, nsrc int
	for nsrc < len(src) {
		if ndst >= len(dst) {
			e = transform.ErrShortDst
			break
		}

		r, sz := utf8.DecodeRune(src[nsrc:])
		if r == utf8.RuneError && sz == 1 {
			// If its inconclusive due to insufficient data in
			// in the source, report it
			if atEOF && !utf8.FullRune(src[nsrc:]) {
				e = transform.ErrShortSrc
				break
			}
		}

		if c, ok := d.bytes[r]; ok {
			dst[ndst] = c
		} else {
			dst[ndst] = d.replace
		}
		nsrc += sz
		ndst++
	}

	return ndst, nsrc, e
}
17 vendor/github.com/gdamore/encoding/doc.go generated vendored Normal file
@@ -0,0 +1,17 @@
// Copyright 2015 Garrett D'Amore
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

// Package encoding provides a few of the encoding structures that are
// missing from the Go x/text/encoding tree.
package encoding
273 vendor/github.com/gdamore/encoding/ebcdic.go generated vendored Normal file
@@ -0,0 +1,273 @@
// Copyright 2015 Garrett D'Amore
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package encoding

import (
	"golang.org/x/text/encoding"
)

// EBCDIC represents the 8-bit EBCDIC scheme, found in some mainframe
// environments. If you don't know what this is, consider yourself lucky.
var EBCDIC encoding.Encoding

func init() {
	cm := &Charmap{
		ReplacementChar: '\x3f',
		Map: map[byte]rune{
			// 0x00-0x03 match
			0x04: RuneError,
			0x05: '\t',
			0x06: RuneError,
			0x07: '\x7f',
			0x08: RuneError,
			0x09: RuneError,
			0x0a: RuneError,
			// 0x0b-0x13 match
			0x14: RuneError,
			0x15: '\x85', // Not in any ISO code
			0x16: '\x08',
			0x17: RuneError,
			// 0x18-0x19 match
			0x1a: RuneError,
			0x1b: RuneError,
			// 0x1c-0x1f match
			0x20: RuneError,
			0x21: RuneError,
			0x22: RuneError,
			0x23: RuneError,
			0x24: RuneError,
			0x25: '\n',
			0x26: '\x17',
			0x27: '\x1b',
			0x28: RuneError,
			0x29: RuneError,
			0x2a: RuneError,
			0x2b: RuneError,
			0x2c: RuneError,
			0x2d: '\x05',
			0x2e: '\x06',
			0x2f: '\x07',
			0x30: RuneError,
			0x31: RuneError,
			0x32: '\x16',
			0x33: RuneError,
			0x34: RuneError,
			0x35: RuneError,
			0x36: RuneError,
			0x37: '\x04',
			0x38: RuneError,
			0x39: RuneError,
			0x3a: RuneError,
			0x3b: RuneError,
			0x3c: '\x14',
			0x3d: '\x15',
			0x3e: RuneError,
			0x3f: '\x1a', // also replacement char
			0x40: ' ',
			0x41: '\xa0',
			0x42: RuneError,
			0x43: RuneError,
			0x44: RuneError,
			0x45: RuneError,
			0x46: RuneError,
			0x47: RuneError,
			0x48: RuneError,
			0x49: RuneError,
			0x4a: RuneError,
			0x4b: '.',
			0x4c: '<',
			0x4d: '(',
			0x4e: '+',
			0x4f: '|',
			0x50: '&',
			0x51: RuneError,
			0x52: RuneError,
			0x53: RuneError,
			0x54: RuneError,
			0x55: RuneError,
			0x56: RuneError,
			0x57: RuneError,
			0x58: RuneError,
			0x59: RuneError,
			0x5a: '!',
			0x5b: '$',
			0x5c: '*',
			0x5d: ')',
			0x5e: ';',
			0x5f: '¬',
			0x60: '-',
			0x61: '/',
			0x62: RuneError,
			0x63: RuneError,
			0x64: RuneError,
			0x65: RuneError,
			0x66: RuneError,
			0x67: RuneError,
			0x68: RuneError,
			0x69: RuneError,
			0x6a: '¦',
			0x6b: ',',
			0x6c: '%',
			0x6d: '_',
			0x6e: '>',
			0x6f: '?',
			0x70: RuneError,
			0x71: RuneError,
			0x72: RuneError,
			0x73: RuneError,
			0x74: RuneError,
			0x75: RuneError,
			0x76: RuneError,
			0x77: RuneError,
			0x78: RuneError,
			0x79: '`',
			0x7a: ':',
			0x7b: '#',
			0x7c: '@',
			0x7d: '\'',
			0x7e: '=',
			0x7f: '"',
			0x80: RuneError,
			0x81: 'a',
			0x82: 'b',
			0x83: 'c',
			0x84: 'd',
			0x85: 'e',
			0x86: 'f',
			0x87: 'g',
			0x88: 'h',
			0x89: 'i',
			0x8a: RuneError,
			0x8b: RuneError,
			0x8c: RuneError,
			0x8d: RuneError,
			0x8e: RuneError,
			0x8f: '±',
			0x90: RuneError,
			0x91: 'j',
			0x92: 'k',
			0x93: 'l',
			0x94: 'm',
			0x95: 'n',
			0x96: 'o',
			0x97: 'p',
			0x98: 'q',
			0x99: 'r',
			0x9a: RuneError,
			0x9b: RuneError,
			0x9c: RuneError,
			0x9d: RuneError,
			0x9e: RuneError,
			0x9f: RuneError,
			0xa0: RuneError,
			0xa1: '~',
			0xa2: 's',
			0xa3: 't',
			0xa4: 'u',
			0xa5: 'v',
			0xa6: 'w',
			0xa7: 'x',
			0xa8: 'y',
			0xa9: 'z',
			0xaa: RuneError,
			0xab: RuneError,
			0xac: RuneError,
			0xad: RuneError,
			0xae: RuneError,
			0xaf: RuneError,
			0xb0: '^',
			0xb1: RuneError,
			0xb2: RuneError,
			0xb3: RuneError,
			0xb4: RuneError,
			0xb5: RuneError,
			0xb6: RuneError,
			0xb7: RuneError,
			0xb8: RuneError,
			0xb9: RuneError,
			0xba: '[',
			0xbb: ']',
			0xbc: RuneError,
			0xbd: RuneError,
			0xbe: RuneError,
			0xbf: RuneError,
			0xc0: '{',
			0xc1: 'A',
			0xc2: 'B',
			0xc3: 'C',
			0xc4: 'D',
			0xc5: 'E',
			0xc6: 'F',
			0xc7: 'G',
			0xc8: 'H',
			0xc9: 'I',
			0xca: '\xad', // NB: soft hyphen
			0xcb: RuneError,
			0xcc: RuneError,
			0xcd: RuneError,
			0xce: RuneError,
			0xcf: RuneError,
			0xd0: '}',
			0xd1: 'J',
			0xd2: 'K',
			0xd3: 'L',
			0xd4: 'M',
			0xd5: 'N',
			0xd6: 'O',
			0xd7: 'P',
			0xd8: 'Q',
			0xd9: 'R',
			0xda: RuneError,
			0xdb: RuneError,
			0xdc: RuneError,
			0xdd: RuneError,
			0xde: RuneError,
			0xdf: RuneError,
			0xe0: '\\',
			0xe1: '\u2007', // Non-breaking space
			0xe2: 'S',
			0xe3: 'T',
			0xe4: 'U',
			0xe5: 'V',
			0xe6: 'W',
			0xe7: 'X',
			0xe8: 'Y',
			0xe9: 'Z',
			0xea: RuneError,
			0xeb: RuneError,
			0xec: RuneError,
			0xed: RuneError,
			0xee: RuneError,
			0xef: RuneError,
			0xf0: '0',
			0xf1: '1',
			0xf2: '2',
			0xf3: '3',
			0xf4: '4',
			0xf5: '5',
			0xf6: '6',
			0xf7: '7',
			0xf8: '8',
			0xf9: '9',
			0xfa: RuneError,
			0xfb: RuneError,
			0xfc: RuneError,
			0xfd: RuneError,
			0xfe: RuneError,
			0xff: RuneError,
		}}
	cm.Init()
	EBCDIC = cm
}
33 vendor/github.com/gdamore/encoding/latin1.go generated vendored Normal file
@@ -0,0 +1,33 @@
// Copyright 2015 Garrett D'Amore
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package encoding

import (
	"golang.org/x/text/encoding"
)

// ISO8859_1 represents the 8-bit ISO8859-1 scheme. It decodes directly to
// UTF-8 without change, as all ISO8859-1 values are legal UTF-8.
// Unicode values less than 256 (i.e. 8 bits) map 1:1 with 8859-1.
// It encodes runes outside of that to 0x1A, the ASCII substitution character.
var ISO8859_1 encoding.Encoding

func init() {
	cm := &Charmap{}
	cm.Init()

	// 8859-1 is the 8-bit identity map for Unicode.
	ISO8859_1 = cm
}
35 vendor/github.com/gdamore/encoding/latin5.go generated vendored Normal file
@@ -0,0 +1,35 @@
// Copyright 2015 Garrett D'Amore
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package encoding

import (
	"golang.org/x/text/encoding"
)

// ISO8859_9 represents the 8-bit ISO8859-9 scheme.
var ISO8859_9 encoding.Encoding

func init() {
	cm := &Charmap{Map: map[byte]rune{
		0xD0: 'Ğ',
		0xDD: 'İ',
		0xDE: 'Ş',
		0xF0: 'ğ',
		0xFD: 'ı',
		0xFE: 'ş',
	}}
	cm.Init()
	ISO8859_9 = cm
}
35 vendor/github.com/gdamore/encoding/utf8.go generated vendored Normal file
@@ -0,0 +1,35 @@
// Copyright 2015 Garrett D'Amore
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package encoding

import (
	"golang.org/x/text/encoding"
)

type validUtf8 struct{}

// UTF8 is an encoding for UTF-8. All it does is verify that the UTF-8
// in is valid. The main reason for its existence is that it will detect
// and report ErrSrcShort or ErrDstShort, whereas the Nop encoding just
// passes every byte, blithely.
var UTF8 encoding.Encoding = validUtf8{}

func (validUtf8) NewDecoder() *encoding.Decoder {
	return &encoding.Decoder{Transformer: encoding.UTF8Validator}
}

func (validUtf8) NewEncoder() *encoding.Encoder {
	return &encoding.Encoder{Transformer: encoding.UTF8Validator}
}
13 vendor/github.com/gdamore/tcell/v2/.appveyor.yml generated vendored Normal file
@@ -0,0 +1,13 @@
version: 1.0.{build}
clone_folder: c:\gopath\src\github.com\gdamore\tcell
environment:
  GOPATH: c:\gopath
build_script:
  - go version
  - go env
  - SET PATH=%LOCALAPPDATA%\atom\bin;%GOPATH%\bin;%PATH%
  - go get -t ./...
  - go build
  - go install ./...
test_script:
  - go test ./...
1 vendor/github.com/gdamore/tcell/v2/.gitignore generated vendored Normal file
@@ -0,0 +1 @@
coverage.txt
18 vendor/github.com/gdamore/tcell/v2/.travis.yml generated vendored Normal file
@@ -0,0 +1,18 @@
language: go

go:
  - 1.15.x
  - master

arch:
  - amd64
  - ppc64le

before_install:
  - go get -t -v ./...

script:
  - go test -race -coverprofile=coverage.txt -covermode=atomic

after_success:
  - bash <(curl -s https://codecov.io/bash)
4 vendor/github.com/gdamore/tcell/v2/AUTHORS generated vendored Normal file
@@ -0,0 +1,4 @@
Garrett D'Amore <garrett@damore.org>
Zachary Yedidia <zyedidia@gmail.com>
Junegunn Choi <junegunn.c@gmail.com>
Staysail Systems, Inc. <info@staysail.tech>
82
vendor/github.com/gdamore/tcell/v2/CHANGESv2.md
generated
vendored
Normal file
82
vendor/github.com/gdamore/tcell/v2/CHANGESv2.md
generated
vendored
Normal file
@@ -0,0 +1,82 @@
|
||||
## Breaking Changes in _Tcell_ v2
|
||||
|
||||
A number of changes were made to _Tcell_ for version two, and some of these are breaking.
|
||||
|
||||
### Import Path
|
||||
|
||||
The import path for tcell has changed to `github.com/gdamore/tcell/v2` to reflect a new major version.
|
||||
|
||||
### Style Is Not Numeric
|
||||
|
||||
The type `Style` has changed to a structure, to allow us to add additional data such as flags for color setting,
|
||||
more attribute bits, and so forth.
|
||||
Applications that relied on this being a number will need to be updated to use the accessor methods.
|
||||
|
||||
### Mouse Event Changes
|
||||
|
||||
The middle mouse button was reported as button 2 on Linux, but as button 3 on Windows,
|
||||
and the right mouse button was reported the reverse way.
|
||||
_Tcell_ now always reports the right mouse button as button 2, and the middle button as button 3.
|
||||
To help make this clearer, new symbols `ButtonPrimary`, `ButtonSecondary`, and
|
||||
`ButtonMiddle` are provided.
|
||||
(Note that which button is right vs. left may be impacted by user preferences.
|
||||
Usually the left button will be considered the Primary, and the right will be the Secondary.)
|
||||
Applications may need to adjust their handling of mouse buttons 2 and 3 accordingly.

### Terminals Removed

A number of terminal definitions have been removed.
These are mostly ancient definitions unlikely to be used by anyone, such as `adm3a`.

### High Number Function Keys

Historically, terminfo reported function keys with modifiers set as a different function key altogether. For example, Shift-F1 was reported as F13 on XTerm.
_Tcell_ now prefers to report these using the base key (such as F1) with modifiers added.
This works on XTerm and VTE-based emulators, but some emulators may not support it.
The new behavior aligns more closely with behavior on Windows platforms.

## New Features in _Tcell_ v2

These features are not breaking, but are introduced in version 2.

### Improved Modifier Support

For terminals that appear to behave like the venerable XTerm, _Tcell_ automatically adds modifier reporting for the ALT, CTRL, SHIFT, and META keys when the terminal reports them.

### Better Support for Palettes (Themes)

When using a color by its name or palette entry, _Tcell_ now tries to use that palette entry as is; this should avoid some inconsistency and respect terminal themes correctly.

When true fidelity to RGB values is needed, the new `TrueColor()` API can be used to create a direct color, which bypasses the palette altogether.

### Automatic TrueColor Detection

For some terminals, if the `Tc` or `RGB` properties are present in terminfo, _Tcell_ will automatically assume the terminal supports 24-bit color.

### ColorReset

A new color value, `ColorReset`, can be used on the foreground or background to reset the color to the default used by the terminal.

### tmux Support

_Tcell_ now has improved support for tmux, when the `$TERM` variable is set to "tmux".

### Strikethrough Support

_Tcell_ supports strikethrough when the terminal does, using the new `StrikeThrough()` API.

### Bracketed Paste Support

_Tcell_ provides the long-requested capability to discriminate paste events by using the bracketed-paste capability present in some terminals. This is automatically available on terminals that support XTerm-style mouse handling, but applications must opt in by using the new `EnablePaste()` function. A new `EventPaste` type of event is delivered when a paste operation starts and finishes.
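A sketch of the opt-in flow (needs a real terminal to run; the buffering logic a real application would do is elided):

```go
package main

import (
	"log"

	"github.com/gdamore/tcell/v2"
)

func main() {
	s, err := tcell.NewScreen()
	if err != nil {
		log.Fatal(err)
	}
	if err := s.Init(); err != nil {
		log.Fatal(err)
	}
	defer s.Fini()

	// Opt in to bracketed paste; without this, EventPaste is never delivered.
	s.EnablePaste()

	pasting := false
	for {
		switch ev := s.PollEvent().(type) {
		case *tcell.EventPaste:
			// Start() is true at the start of a paste, false at the end.
			pasting = ev.Start()
		case *tcell.EventKey:
			if ev.Key() == tcell.KeyEscape {
				return
			}
			_ = pasting // a real app would buffer pasted keys here
		}
	}
}
```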
202
vendor/github.com/gdamore/tcell/v2/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,202 @@
                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.

   END OF TERMS AND CONDITIONS

   APPENDIX: How to apply the Apache License to your work.

      To apply the Apache License to your work, attach the following
      boilerplate notice, with the fields enclosed by brackets "[]"
      replaced with your own identifying information. (Don't include
      the brackets!) The text should be enclosed in the appropriate
      comment syntax for the file format. We also recommend that a
      file or class name and description of purpose be included on the
      same "printed page" as the copyright notice for easier
      identification within third-party archives.

   Copyright [yyyy] [name of copyright owner]

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
61
vendor/github.com/gdamore/tcell/v2/README-wasm.md
generated
vendored
Normal file
@@ -0,0 +1,61 @@
# WASM for _Tcell_

You can build a _Tcell_ project into a webpage by compiling it slightly differently. This results in a _Tcell_ project you can embed into another HTML page, or use as a standalone page.

## Building your project

WASM needs special build flags in order to work. You can build it by executing

```sh
GOOS=js GOARCH=wasm go build -o yourfile.wasm
```

## Additional files

You also need five other files in the same directory as the wasm binary. Four of them (`tcell.html`, `tcell.js`, `termstyle.css`, and `beep.wav`) are provided in the `webfiles` directory. The last one, `wasm_exec.js`, can be copied from GOROOT into the current directory by executing

```sh
cp "$(go env GOROOT)/misc/wasm/wasm_exec.js" ./
```

In `tcell.js`, you also need to change the constant

```js
const wasmFilePath = "yourfile.wasm"
```

to the file you output when building.

## Displaying your project

### Standalone

You can see the project (with a white background around the terminal) by serving the directory. You can do this using any framework, including another Go program:

```go
// server.go

package main

import (
	"log"
	"net/http"
)

func main() {
	log.Fatal(http.ListenAndServe(":8080",
		http.FileServer(http.Dir("/path/to/dir/to/serve")),
	))
}
```

To see the webpage with this example, type `localhost:8080/tcell.html` into your browser while `server.go` is running.

### Embedding

It is recommended to use an iframe if you want to embed the app into a webpage:

```html
<iframe src="tcell.html" title="Tcell app"></iframe>
```

## Other considerations

### Accessing files

`os.Open(filename)` and other related functions for reading file systems do not work; use `http.Get(filename)` instead.
290
vendor/github.com/gdamore/tcell/v2/README.md
generated
vendored
Normal file
@@ -0,0 +1,290 @@
<img src="logos/tcell.png" style="float: right"/>

# Tcell

_Tcell_ is a _Go_ package that provides a cell based view for text terminals, like _XTerm_. It was inspired by _termbox_, but includes many additional improvements.

[![Stand With Ukraine](https://raw.githubusercontent.com/vshymanskyy/StandWithUkraine/main/badges/StandWithUkraine.svg)](https://stand-with-ukraine.pp.ua)
[![Linux](https://img.shields.io/github/actions/workflow/status/gdamore/tcell/linux.yml?branch=main&logoColor=grey&logo=linux&label=)](https://github.com/gdamore/tcell/actions/workflows/linux.yml)
[![Windows](https://img.shields.io/github/actions/workflow/status/gdamore/tcell/windows.yml?branch=main&logoColor=grey&logo=windows&label=)](https://github.com/gdamore/tcell/actions/workflows/windows.yml)
[![Apache License](https://img.shields.io/github/license/gdamore/tcell.svg?logoColor=silver&logo=opensourceinitiative&color=blue&label=)](https://github.com/gdamore/tcell/blob/master/LICENSE)
[![Docs](https://img.shields.io/badge/godoc-reference-blue.svg?label=&logo=go)](https://pkg.go.dev/github.com/gdamore/tcell/v2)
[![Discord](https://img.shields.io/discord/639503822733180969?label=&logo=discord)](https://discord.gg/urTTxDN)
[![Coverage](https://img.shields.io/codecov/c/github/gdamore/tcell?logoColor=grey&logo=codecov&label=)](https://codecov.io/gh/gdamore/tcell)
[![Go Report Card](https://goreportcard.com/badge/github.com/gdamore/tcell/v2)](https://goreportcard.com/report/github.com/gdamore/tcell/v2)

Please see [here](UKRAINE.md) for an important message for the people of Russia.

NOTE: This is version 2 of _Tcell_. There are breaking changes relative to version 1. Version 1.x remains available using the import `github.com/gdamore/tcell`.

## Tutorial

A brief, and still somewhat rough, [tutorial](TUTORIAL.md) is available.

## Examples

- [proxima5](https://github.com/gdamore/proxima5) - space shooter ([video](https://youtu.be/jNxKTCmY_bQ))
- [govisor](https://github.com/gdamore/govisor) - service management UI ([screenshot](http://2.bp.blogspot.com/--OsvnfzSNow/Vf7aqMw3zXI/AAAAAAAAARo/uOMtOvw4Sbg/s1600/Screen%2BShot%2B2015-09-20%2Bat%2B9.08.41%2BAM.png))
- mouse demo - included mouse test ([screenshot](http://2.bp.blogspot.com/-fWvW5opT0es/VhIdItdKqJI/AAAAAAAAATE/7Ojc0L1SpB0/s1600/Screen%2BShot%2B2015-10-04%2Bat%2B11.47.13%2BPM.png))
- [gomatrix](https://github.com/gdamore/gomatrix) - converted from Termbox
- [micro](https://github.com/zyedidia/micro/) - lightweight text editor with syntax-highlighting and themes
- [godu](https://github.com/viktomas/godu) - utility to discover large files/folders
- [tview](https://github.com/rivo/tview/) - rich interactive widgets
- [cview](https://code.rocketnine.space/tslocum/cview) - user interface toolkit (fork of _tview_)
- [awesome gocui](https://github.com/awesome-gocui/gocui) - Go Console User Interface
- [gomandelbrot](https://github.com/rgm3/gomandelbrot) - Mandelbrot!
- [WTF](https://github.com/senorprogrammer/wtf) - personal information dashboard
- [browsh](https://github.com/browsh-org/browsh) - modern web browser ([video](https://www.youtube.com/watch?v=HZq86XfBoRo))
- [go-life](https://github.com/sachaos/go-life) - Conway's Game of Life
- [gowid](https://github.com/gcla/gowid) - compositional widgets for terminal UIs, inspired by _urwid_
- [termshark](https://termshark.io) - interface for _tshark_, inspired by Wireshark, built on _gowid_
- [go-tetris](https://github.com/MichaelS11/go-tetris) - Go Tetris with AI option
- [fzf](https://github.com/junegunn/fzf) - command-line fuzzy finder
- [ascii-fluid](https://github.com/esimov/ascii-fluid) - fluid simulation controlled by webcam
- [cbind](https://code.rocketnine.space/tslocum/cbind) - key event encoding, decoding and handling
- [tpong](https://github.com/spinzed/tpong) - old-school Pong
- [aerc](https://git.sr.ht/~sircmpwn/aerc) - email client
- [tblogs](https://github.com/ezeoleaf/tblogs) - development blogs reader
- [spinc](https://github.com/lallassu/spinc) - _irssi_ inspired chat application for Cisco Spark/WebEx
- [gorss](https://github.com/lallassu/gorss) - RSS/Atom feed reader
- [memoryalike](https://github.com/Bios-Marcel/memoryalike) - memorization game
- [lf](https://github.com/gokcehan/lf) - file manager
- [goful](https://github.com/anmitsu/goful) - CUI file manager
- [gokeybr](https://github.com/bunyk/gokeybr) - deliberately practice your typing
- [gonano](https://github.com/jbaramidze/gonano) - editor, mimics _nano_
- [uchess](https://github.com/tmountain/uchess) - UCI chess client
- [min](https://github.com/a-h/min) - Gemini browser
- [ov](https://github.com/noborus/ov) - file pager
- [tmux-wormhole](https://github.com/gcla/tmux-wormhole) - _tmux_ plugin to transfer files
- [gruid-tcell](https://github.com/anaseto/gruid-tcell) - driver for the grid based UI and game framework
- [aretext](https://github.com/aretext/aretext) - minimalist text editor with _vim_ key bindings
- [sync](https://github.com/kyprifog/sync) - GitHub repo synchronization tool
- [statusbar](https://github.com/kyprifog/statusbar) - statusbar motivation tool for tracking periodic tasks/goals
- [todo](https://github.com/kyprifog/todo) - simple todo app
- [gosnakego](https://github.com/liweiyi88/gosnakego) - a snake game
- [gbb](https://github.com/sdemingo/gbb) - A classical bulletin board app for tildes or public unix servers
- [lil](https://github.com/andrievsky/lil) - A simple and flexible interface for any service by implementing only list and get operations
- [hero.go](https://github.com/barisbll/hero.go) - 2d monster shooter ([video](https://user-images.githubusercontent.com/40062673/277157369-240d7606-b471-4aa1-8c54-4379a513122b.mp4))
- [go-tetris](https://github.com/aaronriekenberg/go-tetris) - simple tetris game for native terminal and WASM using github actions+pages
- [oddshub](https://github.com/dos-2/oddshub) - A TUI designed for analyzing sports betting odds

## Pure Go Terminfo Database

_Tcell_ includes a full parser and expander for terminfo capability strings, so that it can avoid hard-coding escape strings for formatting. It also favors portability, and includes support for all POSIX systems.

The database is also flexible and extensible: it can be modified by running a program to rebuild either the entire database or an entry for just a single terminal.

## More Portable

_Tcell_ is portable to a wide variety of systems, and is pure Go, without any need for CGO. _Tcell_ is believed to work with all mainstream systems officially supported by Go.

## No Async IO

_Tcell_ is able to operate without requiring `SIGIO` signals (unlike _termbox_) or asynchronous I/O, and can instead use standard Go file objects and goroutines. This means it should be safe, especially for use with programs that use exec or otherwise need to manipulate the tty streams. This model is also much closer to idiomatic Go, leading to fewer surprises.

## Rich Unicode & non-Unicode support

_Tcell_ includes enhanced support for Unicode, including wide characters and combining characters, provided your terminal can support them. Note that Windows terminals generally don't support the full Unicode repertoire.

It will also convert to and from Unicode locales, so that the program can work with UTF-8 internally and get reasonable output in other locales. _Tcell_ tries hard to convert to native characters on both input and output. On output _Tcell_ even makes use of the alternate character set to facilitate drawing certain characters.

## More Function Keys

_Tcell_ also has richer support for a larger number of special keys that some terminals can send.

## Better Color Handling

_Tcell_ will respect your terminal's color space as specified within your terminfo entries. For example, attempts to emit color sequences on VT100 terminals won't result in unintended consequences.

In legacy Windows mode, _Tcell_ supports 16 colors, bold, dim, and reverse, instead of just termbox's 8 colors with reverse. (Note that there is some conflation with bold/dim and colors.) Modern Windows 10 can benefit from much richer colors, however.

_Tcell_ maps 16 colors down to 8 for terminals that need it. (The upper 8 colors are just brighter versions of the lower 8.)

## Better Mouse Support

_Tcell_ supports enhanced mouse tracking mode, so your application can receive regular mouse motion events and wheel events, if your terminal supports it.

(Note: The Windows 10 Terminal application suffers from a flaw in this regard, and does not support mouse interaction. The stock Windows 10 console host fired up with cmd.exe or PowerShell works fine, however.)

## _Termbox_ Compatibility

A compatibility layer for _termbox_ is provided in the `compat` directory. To use it, try importing `github.com/gdamore/tcell/termbox` instead. Most _termbox-go_ programs will probably work without further modification.

## Working With Unicode

Internally _Tcell_ uses UTF-8, just like Go. However, _Tcell_ understands how to convert to and from other character sets, using the capabilities of the `golang.org/x/text/encoding` packages. Your application must supply them, as the full set of the most common ones bloats the program by about 2 MB. If you're lazy and want them all anyway, see the `encoding` sub-directory.

## Wide & Combining Characters

The `SetContent()` API takes a primary rune and an optional list of combining runes. If any of the runes is a wide (East Asian) rune occupying two cells, then the library will skip output from the following cell. Care must be taken in the application to avoid explicitly attempting to set content in the next cell, otherwise the results are undefined. (Normally the wide character is displayed, and the other character is not; do not depend on that behavior.)
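A sketch of the rule above (needs a real terminal to run): the wide rune shadows the next cell, so the following narrow rune must be placed two cells over, and combining runes ride along in the optional slice:

```go
package main

import (
	"log"

	"github.com/gdamore/tcell/v2"
)

func main() {
	s, err := tcell.NewScreen()
	if err != nil {
		log.Fatal(err)
	}
	if err := s.Init(); err != nil {
		log.Fatal(err)
	}
	defer s.Fini()

	style := tcell.StyleDefault

	// '世' is a wide (East Asian) rune: it occupies cells (0,0) and (1,0).
	s.SetContent(0, 0, '世', nil, style)

	// Write the next narrow rune at x=2, not x=1 -- setting content
	// in the shadowed cell at (1,0) has undefined results.
	s.SetContent(2, 0, 'e', []rune{'\u0301'}, style) // 'e' + combining acute

	s.Show()
	s.PollEvent() // wait for an event before exiting
}
```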

Older terminal applications (especially on systems like Windows 8) lack support for advanced Unicode, and thus may not fare well.

## Colors

_Tcell_ assumes the ANSI/XTerm color model, including the 256-color map that XTerm uses when it supports 256 colors. The terminfo guidance is honored with respect to the number of colors supported. Also, only terminals that expose ANSI-style `setaf` and `setab` will support color; if you have a color terminal that only has `setf` and `setb`, please submit a ticket.

## 24-bit Color

_Tcell_ _supports 24-bit color!_ (That is, if your terminal can support it.)

NOTE: Technically the approach of using 24-bit RGB values for color is more accurately described as "direct color", but most people use the term "true color". We follow the (inaccurate) common convention.

There are a few ways you can enable (or disable) true color.

- For many terminals, we can detect it automatically if your terminal includes the `RGB` or `Tc` capabilities (or rather it did when the database was updated).

- You can force it by setting the `COLORTERM` environment variable to `24-bit`, `truecolor`, or `24bit`. This is the same method used by most other terminal applications that support 24-bit color.

- If you set your `TERM` environment variable to a value with the suffix `-truecolor`, then 24-bit color compatible with XTerm and ECMA-48 will be assumed. (This feature is deprecated; it is recommended to use one of the other methods listed above.)

- You can disable 24-bit color by setting `TCELL_TRUECOLOR=disable` in your environment.
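The environment toggles above can be exercised from the shell; for example (assuming a _Tcell_ binary named `myapp`, a hypothetical name):

```shell
# Force true color on, even if the terminfo entry lacks the RGB/Tc capability:
COLORTERM=truecolor ./myapp

# Force true color off, regardless of terminal capabilities:
TCELL_TRUECOLOR=disable ./myapp
```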

When using TrueColor, programs display the colors the programmer intended, overriding any "themes" you may have set in your terminal emulator. (For some cases, accurate color fidelity is more important than respecting themes. For other cases, such as typical text apps that only use a few colors, it's more desirable to respect the themes that the user has established.)

## Performance

Reasonable attempts have been made to minimize sending data to terminals, avoiding repeated sequences or drawing the same cell on refresh updates.

## Terminfo

(Not relevant for Windows users.)

The Terminfo implementation operates with a built-in database. This should satisfy most users. However, on systems with ncurses installed, it can also dynamically parse the output from `infocmp` for terminals it does not already know about.

See the `terminfo/` directory for more information about generating new entries for the built-in database.

_Tcell_ requires that the terminal support the `cup` mode of cursor addressing. Ancient terminals without the ability to position the cursor directly are not supported. This is unlikely to be a problem; such terminals have not been mass-produced since the early 1970s.

## Mouse Support

Mouse support is detected via the `kmous` terminfo variable; however, enabling/disabling and decoding mouse events is done using hard-coded sequences based on the XTerm X11 model. All popular terminals with mouse tracking support this model. (Full terminfo support is not possible, as the terminfo sequences are not defined.)

On Windows, the mouse works normally.

Mouse wheel buttons on various terminals are known to work, but the support in terminal emulators, as well as support for various buttons and live mouse tracking, varies widely. Modern _xterm_, macOS _Terminal_, and _iTerm_ all work well.

## Bracketed Paste

Terminals that appear to support the XTerm mouse model can also support bracketed paste, for applications that opt in. See `EnablePaste()` for details.

## Testability

There is a `SimulationScreen` that can be used to simulate a real screen for automated testing. The supplied tests do this. The simulation supports event delivery and screen resizing, and can inject events and examine the "physical" screen contents.

## Platforms

### POSIX (Linux, FreeBSD, macOS, Solaris, etc.)

Everything works using pure Go on mainstream platforms. Some more esoteric platforms (e.g., AIX) may need to be added. Pull requests are welcome!

### Windows

Windows console mode applications are supported.

Modern console applications like ConEmu and the Windows 10 terminal support all the good features (resize, mouse tracking, etc.).

### WASM

WASM is supported, but needs additional setup detailed in [README-wasm](README-wasm.md).

### Plan9 and others

These platforms won't work, but compilation stubs are supplied for folks who want to include parts of this in software for those platforms. The simulation screen works, but since _Tcell_ doesn't know how to allocate a real screen object on those platforms, `NewScreen()` will fail.

If anyone has wisdom about how to improve support for these, please let me know. PRs are especially welcome.

### Commercial Support

_Tcell_ is absolutely free, but if you want to obtain commercial, professional support, there are options.

- [TideLift](https://tidelift.com/) subscriptions include support for _Tcell_, as well as many other open source packages.
- [Staysail Systems Inc.](mailto:info@staysail.tech) offers direct support and custom development around _Tcell_ on an hourly basis.
15 vendor/github.com/gdamore/tcell/v2/SECURITY.md generated vendored Normal file
@@ -0,0 +1,15 @@
# SECURITY

It's somewhat unlikely that tcell is in a security sensitive path,
but we do take security seriously.

## Vulnerability Response

If you report a vulnerability, we will respond within 2 business days.

## Report a Vulnerability

If you wish to report a vulnerability found in tcell, simply send a message
to garrett@damore.org. You may also reach us on our discord channel -
https://discord.gg/urTTxDN - a private message to `gdamore` on that channel
may be submitted instead of mail.
313 vendor/github.com/gdamore/tcell/v2/TUTORIAL.md generated vendored Normal file
@@ -0,0 +1,313 @@
# _Tcell_ Tutorial

_Tcell_ provides a low-level, portable API for building terminal-based programs.
A [terminal emulator](https://en.wikipedia.org/wiki/Terminal_emulator)
(or a real terminal such as a DEC VT-220) is used to interact with such a program.

_Tcell_'s interface is fairly low-level.
While it provides a reasonably portable way of dealing with all the usual terminal
features, it may be easier to utilize a higher level framework.
A number of such frameworks are listed on the _Tcell_ main [README](README.md).

This tutorial provides the details of _Tcell_, and is appropriate for developers
wishing to create their own application frameworks or needing more direct access
to the terminal capabilities.

## Resize events

Applications receive an event of type `EventResize` when they are first initialized and each time the terminal is resized.
The new size is available as `Size`.

```go
switch ev := ev.(type) {
case *tcell.EventResize:
    w, h := ev.Size()
    logMessage(fmt.Sprintf("Resized to %dx%d", w, h))
}
```
## Key events

When a key is pressed, applications receive an event of type `EventKey`.
This event describes the modifier keys pressed (if any) and the pressed key or rune.

When a rune key is pressed, an event with its `Key` set to `KeyRune` is dispatched.

When a non-rune key is pressed, it is available as the `Key` of the event.

```go
switch ev := ev.(type) {
case *tcell.EventKey:
    mod, key, ch := ev.Modifiers(), ev.Key(), ev.Rune()
    logMessage(fmt.Sprintf("EventKey Modifiers: %d Key: %d Rune: %d", mod, key, ch))
}
```

### Key event restrictions

Terminal-based programs have less visibility into keyboard activity than graphical applications.

When a key is pressed and held, additional key press events are sent by the terminal emulator.
The rate of these repeated events depends on the emulator's configuration.
Key release events are not available.

It is not possible to distinguish runes typed while holding Shift from runes typed using Caps Lock.
Capital letters are reported without the Shift modifier.
## Mouse events

Applications receive an event of type `EventMouse` when the mouse moves, or a mouse button is pressed or released.
Mouse events are only delivered if `EnableMouse` has been called.

The mouse buttons being pressed (if any) are available as `Buttons`, and the position of the mouse is available as `Position`.

```go
switch ev := ev.(type) {
case *tcell.EventMouse:
    mod := ev.Modifiers()
    btns := ev.Buttons()
    x, y := ev.Position()
    logMessage(fmt.Sprintf("EventMouse Modifiers: %d Buttons: %d Position: %d,%d", mod, btns, x, y))
}
```

### Mouse buttons

| Identifier | Alias           | Description              |
|------------|-----------------|--------------------------|
| Button1    | ButtonPrimary   | Left button              |
| Button2    | ButtonSecondary | Right button             |
| Button3    | ButtonMiddle    | Middle button            |
| Button4    |                 | Side button (thumb/next) |
| Button5    |                 | Side button (thumb/prev) |
| WheelUp    |                 | Scroll wheel up          |
| WheelDown  |                 | Scroll wheel down        |
| WheelLeft  |                 | Horizontal wheel left    |
| WheelRight |                 | Horizontal wheel right   |
## Usage

To create a _Tcell_ application, first initialize a screen to hold it.

```go
s, err := tcell.NewScreen()
if err != nil {
    log.Fatalf("%+v", err)
}
if err := s.Init(); err != nil {
    log.Fatalf("%+v", err)
}

// Set default text style
defStyle := tcell.StyleDefault.Background(tcell.ColorReset).Foreground(tcell.ColorReset)
s.SetStyle(defStyle)

// Clear screen
s.Clear()
```

Text may be drawn on the screen using `SetContent`.

```go
s.SetContent(0, 0, 'H', nil, defStyle)
s.SetContent(1, 0, 'i', nil, defStyle)
s.SetContent(2, 0, '!', nil, defStyle)
```

To draw text more easily, define a render function.

```go
func drawText(s tcell.Screen, x1, y1, x2, y2 int, style tcell.Style, text string) {
    row := y1
    col := x1
    for _, r := range []rune(text) {
        s.SetContent(col, row, r, nil, style)
        col++
        if col >= x2 {
            row++
            col = x1
        }
        if row > y2 {
            break
        }
    }
}
```

Lastly, define an event loop to handle user input and update application state.

```go
quit := func() {
    s.Fini()
    os.Exit(0)
}
for {
    // Update screen
    s.Show()

    // Poll event
    ev := s.PollEvent()

    // Process event
    switch ev := ev.(type) {
    case *tcell.EventResize:
        s.Sync()
    case *tcell.EventKey:
        if ev.Key() == tcell.KeyEscape || ev.Key() == tcell.KeyCtrlC {
            quit()
        }
    }
}
```
## Demo application

The following demonstrates how to initialize a screen, draw text/graphics and handle user input.

```go
package main

import (
    "fmt"
    "log"

    "github.com/gdamore/tcell/v2"
)

func drawText(s tcell.Screen, x1, y1, x2, y2 int, style tcell.Style, text string) {
    row := y1
    col := x1
    for _, r := range []rune(text) {
        s.SetContent(col, row, r, nil, style)
        col++
        if col >= x2 {
            row++
            col = x1
        }
        if row > y2 {
            break
        }
    }
}

func drawBox(s tcell.Screen, x1, y1, x2, y2 int, style tcell.Style, text string) {
    if y2 < y1 {
        y1, y2 = y2, y1
    }
    if x2 < x1 {
        x1, x2 = x2, x1
    }

    // Fill background
    for row := y1; row <= y2; row++ {
        for col := x1; col <= x2; col++ {
            s.SetContent(col, row, ' ', nil, style)
        }
    }

    // Draw borders
    for col := x1; col <= x2; col++ {
        s.SetContent(col, y1, tcell.RuneHLine, nil, style)
        s.SetContent(col, y2, tcell.RuneHLine, nil, style)
    }
    for row := y1 + 1; row < y2; row++ {
        s.SetContent(x1, row, tcell.RuneVLine, nil, style)
        s.SetContent(x2, row, tcell.RuneVLine, nil, style)
    }

    // Only draw corners if necessary
    if y1 != y2 && x1 != x2 {
        s.SetContent(x1, y1, tcell.RuneULCorner, nil, style)
        s.SetContent(x2, y1, tcell.RuneURCorner, nil, style)
        s.SetContent(x1, y2, tcell.RuneLLCorner, nil, style)
        s.SetContent(x2, y2, tcell.RuneLRCorner, nil, style)
    }

    drawText(s, x1+1, y1+1, x2-1, y2-1, style, text)
}

func main() {
    defStyle := tcell.StyleDefault.Background(tcell.ColorReset).Foreground(tcell.ColorReset)
    boxStyle := tcell.StyleDefault.Foreground(tcell.ColorWhite).Background(tcell.ColorPurple)

    // Initialize screen
    s, err := tcell.NewScreen()
    if err != nil {
        log.Fatalf("%+v", err)
    }
    if err := s.Init(); err != nil {
        log.Fatalf("%+v", err)
    }
    s.SetStyle(defStyle)
    s.EnableMouse()
    s.EnablePaste()
    s.Clear()

    // Draw initial boxes
    drawBox(s, 1, 1, 42, 7, boxStyle, "Click and drag to draw a box")
    drawBox(s, 5, 9, 32, 14, boxStyle, "Press C to reset")

    quit := func() {
        // You have to catch panics in a defer, clean up, and
        // re-raise them - otherwise your application can
        // die without leaving any diagnostic trace.
        maybePanic := recover()
        s.Fini()
        if maybePanic != nil {
            panic(maybePanic)
        }
    }
    defer quit()

    // Here's how to get the screen size when you need it.
    // xmax, ymax := s.Size()

    // Here's an example of how to inject a keystroke where it will
    // be picked up by the next PollEvent call. Note that the
    // queue is LIFO, it has a limited length, and PostEvent() can
    // return an error.
    // s.PostEvent(tcell.NewEventKey(tcell.KeyRune, rune('a'), 0))

    // Event loop
    ox, oy := -1, -1
    for {
        // Update screen
        s.Show()

        // Poll event
        ev := s.PollEvent()

        // Process event
        switch ev := ev.(type) {
        case *tcell.EventResize:
            s.Sync()
        case *tcell.EventKey:
            if ev.Key() == tcell.KeyEscape || ev.Key() == tcell.KeyCtrlC {
                return
            } else if ev.Key() == tcell.KeyCtrlL {
                s.Sync()
            } else if ev.Rune() == 'C' || ev.Rune() == 'c' {
                s.Clear()
            }
        case *tcell.EventMouse:
            x, y := ev.Position()

            switch ev.Buttons() {
            case tcell.Button1, tcell.Button2:
                if ox < 0 {
                    ox, oy = x, y // record location when click started
                }

            case tcell.ButtonNone:
                if ox >= 0 {
                    label := fmt.Sprintf("%d,%d to %d,%d", ox, oy, x, y)
                    drawBox(s, ox, oy, x, y, boxStyle, label)
                    ox, oy = -1, -1
                }
            }
        }
    }
}
```
77 vendor/github.com/gdamore/tcell/v2/UKRAINE.md generated vendored Normal file
@@ -0,0 +1,77 @@
# Ukraine, Russia, and a World Tragedy

## A message to those inside Russia

### Written March 4, 2022.

It is with a very heavy heart that I write this. I am normally opposed to the use of open source
projects to communicate political positions or advocate for things outside the immediate relevancy
to that project.

However, the events occurring in Ukraine, and specifically the unprecedented invasion of Ukraine by
Russian forces operating under orders from Russian President Vladimir Putin, compel me to speak out.

Those who know me know that I have family, friends, and colleagues in both Russia and Ukraine. My closest friends
have historically been Russian friends from my wife's hometown of Chelyabinsk. I myself have in the past
frequently traveled to Russia, and indeed operated a software development firm with offices in St. Petersburg.
I had a special kinship with Russia and its people.

I say "had", because I fear that the actions of Putin, and the massive disinformation campaign that his regime
has waged inside Russia, mean that it's likely that I won't see those friends again. At present, I'm not sure
my wife will see her own mother again. We no longer feel it's safe for either of us to return to Russia given
actions taken by the regime to crack down on those who express disagreement.

Russian citizens are being led to believe that Russia is acting purely defensively, that only legitimate military
targets are being struck, and that all the information we have received in the West is fake.

I am confident that nothing could be further from the truth.

This has caused many in Russia, including people whom I respect and believe to be smarter than this, to
stand by Putin and endorse his actions. The claim is that the entirety of NATO is operating at the behest
of the USA, and that the entirety of Europe was poised to attack Russia. While this is clearly absurd to those
of us with any understanding of western politics, Russian citizens are being fed this lie, and believing it.

If you're reading this from inside Russia -- YOU are the person that I hope this message reaches. Your
government is LYING to you. Of course, all governments lie all the time. But consider this. Almost the
entire world has condemned the invasion of Ukraine as criminal, and has applied sanctions. Even countries
which have poor relations with the US are sanctioning Russia, as are nations which historically have remained
neutral. (Famously neutral -- even during World War II, Switzerland has acted to apply sanctions in
concert with the rest of the world.)

Ask yourself, why does Putin fear a free press so much, if what he says is true? Why the crack-downs on
children expressing only a desire for peace with Ukraine? Why would the entire world be unified against him,
if Putin were in the right? Why would the only countries that stood with Russia against
the UN resolution to condemn these acts as crimes be Belarus, North Korea, and Syria? Even countries normally
allied to Russia could not bring themselves to do more than abstain from the vote to condemn it.

To be clear, I do not claim that the actions taken by the West or by the Ukrainian government were completely
blameless. On the contrary, I understand that Western media is biased, and the truth is rarely exactly
as reported. I believe that there is a kernel of truth in the claims of fascist and ultra-nationalist
militias operating in Ukraine and specifically Donbas. However, I am also equally certain that Putin's
response is out of proportion, and that concerns about such militias are principally just a pretext to justify
an invasion.

Europe is at war, unlike anything we've seen in my lifetime. The world is more divided, and closer to nuclear
holocaust, than it has been since the Cold War. And that is 100% the fault of Putin.

While Putin remains in power, there cannot really be any way for Russian international relations to return
to normal. Putin has set your country on a path to return to the Cold War, likely because he fancies himself
to be a new Stalin. However, unlike the Soviet Union, the Russian economy does not have the wherewithal to
stand on its own, and the invasion of Ukraine has fully ensured that Russia will not find any friends anywhere
else in Europe, and probably few places in Asia.

The *only* paths forward for Russia are either a Russia without Putin (and those who would support his agenda),
or a complete breakdown of Russian prosperity, likely followed by the increasing international conflict that will
be the natural escalation from a country that is isolated and impoverished. Those of us observing from the West are
gravely concerned, because we cannot see any end to this madness that does not result in nuclear conflict,
unless it comes from within.

In the meantime, the worst prices will be paid by innocents in Ukraine, and by young Russian men
forced to carry out the orders of Putin's corrupt regime.

And *that* is why I write this -- to appeal to those within Russia to open your eyes, and think with
your minds. It is right and proper to be proud of your country and its rich heritage. But it is also
right and proper to look for ways to save it from the ruinous path that its current leadership has set it upon,
and to recognize when that leadership is no longer acting in the interest of the country or its people.

- Garrett D'Amore, March 4, 2022
34 vendor/github.com/gdamore/tcell/v2/attr.go generated vendored Normal file
@@ -0,0 +1,34 @@
// Copyright 2024 The TCell Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package tcell

// AttrMask represents a mask of text attributes, apart from color.
// Note that support for attributes may vary widely across terminals.
type AttrMask uint

// Attributes are not colors, but affect the display of text. They can
// be combined, in some cases, but not others. (E.g. you can have Dim Italic,
// but CurlyUnderline cannot be mixed with DottedUnderline.)
const (
    AttrBold AttrMask = 1 << iota
    AttrBlink
    AttrReverse
    AttrUnderline // Deprecated: Use UnderlineStyle
    AttrDim
    AttrItalic
    AttrStrikeThrough
    AttrInvalid AttrMask = 1 << 31 // Mark the style or attributes invalid
    AttrNone    AttrMask = 0       // Just normal text.
)
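The `AttrMask` declaration above uses Go's `1 << iota` bit-flag idiom, so attributes combine with bitwise OR and are tested with AND. As a hedged aside, here is a standalone sketch of that pattern; it redeclares a few mask names for illustration and does not import tcell itself.

```go
package main

import "fmt"

// A minimal re-creation of the AttrMask bit-flag pattern shown above.
// The names mirror tcell's, but this is a self-contained sketch.
type AttrMask uint

const (
    AttrBold AttrMask = 1 << iota
    AttrBlink
    AttrReverse
    AttrDim
    AttrItalic
    AttrNone AttrMask = 0 // no attributes set
)

func main() {
    style := AttrBold | AttrItalic      // combine attributes with OR
    fmt.Println(style&AttrBold != 0)    // test a flag: true
    fmt.Println(style&AttrReverse != 0) // not set: false
    style &^= AttrBold                  // clear a flag with AND NOT
    fmt.Println(style == AttrItalic)    // true
}
```

Because each constant occupies its own bit, any subset of attributes fits in a single integer, which is why a `Style` can carry them cheaply per cell.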
249 vendor/github.com/gdamore/tcell/v2/cell.go generated vendored Normal file
@@ -0,0 +1,249 @@
// Copyright 2024 The TCell Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package tcell

import (
    "os"
    "reflect"

    runewidth "github.com/mattn/go-runewidth"
)

type cell struct {
    currMain  rune
    currComb  []rune
    currStyle Style
    lastMain  rune
    lastStyle Style
    lastComb  []rune
    width     int
    lock      bool
}

// CellBuffer represents a two-dimensional array of character cells.
// This is primarily intended for use by Screen implementors; it
// contains much of the common code they need. To create one, just
// declare a variable of its type; no explicit initialization is necessary.
//
// CellBuffer is not thread safe.
type CellBuffer struct {
    w     int
    h     int
    cells []cell
}

// SetContent sets the contents (primary rune, combining runes,
// and style) for a cell at a given location. If the background or
// foreground of the style is set to ColorNone, then the respective
// color is left unchanged.
func (cb *CellBuffer) SetContent(x int, y int,
    mainc rune, combc []rune, style Style,
) {
    if x >= 0 && y >= 0 && x < cb.w && y < cb.h {
        c := &cb.cells[(y*cb.w)+x]

        // Wide characters: we want to mark the "wide" cells
        // dirty as well as the base cell, to make sure we consider
        // both cells as dirty together. We only need to do this
        // if we're changing content.
        if (c.width > 0) && (mainc != c.currMain || len(combc) != len(c.currComb) || (len(combc) > 0 && !reflect.DeepEqual(combc, c.currComb))) {
            for i := 0; i < c.width; i++ {
                cb.SetDirty(x+i, y, true)
            }
        }

        c.currComb = append([]rune{}, combc...)

        if c.currMain != mainc {
            c.width = runewidth.RuneWidth(mainc)
        }
        c.currMain = mainc
        if style.fg == ColorNone {
            style.fg = c.currStyle.fg
        }
        if style.bg == ColorNone {
            style.bg = c.currStyle.bg
        }
        c.currStyle = style
    }
}

// GetContent returns the contents of a character cell, including the
// primary rune, any combining character runes (which will usually be
// nil), the style, and the display width in cells. (The width can be
// either 1, normally, or 2 for East Asian full-width characters.)
func (cb *CellBuffer) GetContent(x, y int) (rune, []rune, Style, int) {
    var mainc rune
    var combc []rune
    var style Style
    var width int
    if x >= 0 && y >= 0 && x < cb.w && y < cb.h {
        c := &cb.cells[(y*cb.w)+x]
        mainc, combc, style = c.currMain, c.currComb, c.currStyle
        if width = c.width; width == 0 || mainc < ' ' {
            width = 1
            mainc = ' '
        }
    }
    return mainc, combc, style, width
}

// Size returns the (width, height) in cells of the buffer.
func (cb *CellBuffer) Size() (int, int) {
    return cb.w, cb.h
}

// Invalidate marks all characters within the buffer as dirty.
func (cb *CellBuffer) Invalidate() {
    for i := range cb.cells {
        cb.cells[i].lastMain = rune(0)
    }
}

// Dirty checks if a character at the given location needs to be
// refreshed on the physical display. This returns true if the cell
// content is different since the last time it was marked clean.
func (cb *CellBuffer) Dirty(x, y int) bool {
    if x >= 0 && y >= 0 && x < cb.w && y < cb.h {
        c := &cb.cells[(y*cb.w)+x]
        if c.lock {
            return false
        }
        if c.lastMain == rune(0) {
            return true
        }
        if c.lastMain != c.currMain {
            return true
        }
        if c.lastStyle != c.currStyle {
            return true
        }
        if len(c.lastComb) != len(c.currComb) {
            return true
        }
        for i := range c.lastComb {
            if c.lastComb[i] != c.currComb[i] {
                return true
            }
        }
    }
    return false
}

// SetDirty is normally used to indicate that a cell has
// been displayed (in which case dirty is false), or to manually
// force a cell to be marked dirty.
func (cb *CellBuffer) SetDirty(x, y int, dirty bool) {
    if x >= 0 && y >= 0 && x < cb.w && y < cb.h {
        c := &cb.cells[(y*cb.w)+x]
        if dirty {
            c.lastMain = rune(0)
        } else {
            if c.currMain == rune(0) {
                c.currMain = ' '
            }
            c.lastMain = c.currMain
            c.lastComb = c.currComb
            c.lastStyle = c.currStyle
        }
    }
}

// LockCell locks a cell from being drawn, effectively marking it "clean" until
// the lock is removed. This can be used to prevent tcell from drawing a given
// cell, even if the underlying content has changed. For example, when drawing a
// sixel graphic directly to a TTY screen an implementer must lock the region
// underneath the graphic to prevent tcell from drawing on top of the graphic.
func (cb *CellBuffer) LockCell(x, y int) {
    if x < 0 || y < 0 {
        return
    }
    if x >= cb.w || y >= cb.h {
        return
    }
    c := &cb.cells[(y*cb.w)+x]
    c.lock = true
}

// UnlockCell removes a lock from the cell and marks it as dirty.
func (cb *CellBuffer) UnlockCell(x, y int) {
    if x < 0 || y < 0 {
        return
    }
    if x >= cb.w || y >= cb.h {
        return
    }
    c := &cb.cells[(y*cb.w)+x]
    c.lock = false
    cb.SetDirty(x, y, true)
}

// Resize is used to resize the cells array, with different dimensions,
// while preserving the original contents. The cells will be invalidated
// so that they can be redrawn.
func (cb *CellBuffer) Resize(w, h int) {
    if cb.h == h && cb.w == w {
        return
    }

    newc := make([]cell, w*h)
    for y := 0; y < h && y < cb.h; y++ {
        for x := 0; x < w && x < cb.w; x++ {
            oc := &cb.cells[(y*cb.w)+x]
            nc := &newc[(y*w)+x]
            nc.currMain = oc.currMain
            nc.currComb = oc.currComb
            nc.currStyle = oc.currStyle
            nc.width = oc.width
            nc.lastMain = rune(0)
        }
    }
    cb.cells = newc
    cb.h = h
    cb.w = w
}

// Fill fills the entire cell buffer array with the specified character
// and style. Normally choose ' ' to clear the screen. This API doesn't
// support combining characters, or characters with a width larger than one.
// If either the foreground or background are ColorNone, then the respective
// color is unchanged.
func (cb *CellBuffer) Fill(r rune, style Style) {
    for i := range cb.cells {
        c := &cb.cells[i]
        c.currMain = r
        c.currComb = nil
        cs := style
        if cs.fg == ColorNone {
            cs.fg = c.currStyle.fg
        }
        if cs.bg == ColorNone {
            cs.bg = c.currStyle.bg
        }
        c.currStyle = cs
        c.width = 1
    }
}

var runeConfig *runewidth.Condition

func init() {
    // The defaults for the runewidth package are poorly chosen for terminal
    // applications. We however will honor the setting in the environment if
    // it is set.
    if os.Getenv("RUNEWIDTH_EASTASIAN") == "" {
        runewidth.DefaultCondition.EastAsianWidth = false
    }
}
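The `Dirty`/`SetDirty` pair above implements classic double-buffered damage tracking: each cell keeps both its current content and a snapshot of what was last flushed to the physical screen, and only cells where the two differ need redrawing. A hedged, stdlib-only sketch of that idea (the `miniCell`/`miniBuffer` names are illustrative, not tcell's):

```go
package main

import "fmt"

// A stripped-down model of CellBuffer's dirty tracking: a cell is "dirty"
// when its current content differs from what was last written to the
// physical screen. Marking it clean snapshots curr into last.
type miniCell struct {
    curr, last rune
}

type miniBuffer struct {
    cells []miniCell
}

func (b *miniBuffer) Set(i int, r rune) { b.cells[i].curr = r }

func (b *miniBuffer) Dirty(i int) bool { return b.cells[i].curr != b.cells[i].last }

func (b *miniBuffer) MarkClean(i int) { b.cells[i].last = b.cells[i].curr }

func main() {
    b := &miniBuffer{cells: make([]miniCell, 3)}
    b.Set(0, 'H')
    fmt.Println(b.Dirty(0)) // true: 'H' has not been flushed yet
    b.MarkClean(0)
    fmt.Println(b.Dirty(0)) // false: screen matches buffer
    b.Set(0, 'H')
    fmt.Println(b.Dirty(0)) // false: same content, nothing to redraw
}
```

The real implementation adds complications this sketch omits: combining runes, styles, wide-character spillover, and the `lock` flag that force-reports a cell as clean.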
22 vendor/github.com/gdamore/tcell/v2/charset_stub.go generated vendored Normal file
@@ -0,0 +1,22 @@
//go:build plan9 || nacl
// +build plan9 nacl

// Copyright 2015 The TCell Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package tcell

func getCharset() string {
    return ""
}
50 vendor/github.com/gdamore/tcell/v2/charset_unix.go generated vendored Normal file
@@ -0,0 +1,50 @@
//go:build !windows && !nacl && !plan9
// +build !windows,!nacl,!plan9

// Copyright 2016 The TCell Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package tcell

import (
    "os"
    "strings"
)

func getCharset() string {
    // Determine the character set. This can help us later.
    // Per POSIX, we search for LC_ALL first, then LC_CTYPE, and
    // finally LANG. First one set wins.
    locale := ""
    if locale = os.Getenv("LC_ALL"); locale == "" {
        if locale = os.Getenv("LC_CTYPE"); locale == "" {
            locale = os.Getenv("LANG")
        }
    }
    if locale == "POSIX" || locale == "C" {
        return "US-ASCII"
    }
    if i := strings.IndexRune(locale, '@'); i >= 0 {
        locale = locale[:i]
    }
    if i := strings.IndexRune(locale, '.'); i >= 0 {
        locale = locale[i+1:]
    } else {
        // Default assumption, and on Linux we can see LC_ALL
        // without a character set, which we assume implies UTF-8.
        return "UTF-8"
    }
    // XXX: add support for aliases
    return locale
}
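The parsing in `getCharset` above follows the POSIX locale string shape `language[_territory][.codeset][@modifier]`: "C"/"POSIX" mean US-ASCII, any `@modifier` suffix is dropped, and the part after `.` names the character set, with UTF-8 assumed when no codeset is given. A self-contained sketch of just that string-handling step (the `charsetFromLocale` helper name is illustrative):

```go
package main

import (
    "fmt"
    "strings"
)

// charsetFromLocale mirrors the locale-parsing logic of getCharset above,
// without the environment-variable lookup.
func charsetFromLocale(locale string) string {
    if locale == "POSIX" || locale == "C" {
        return "US-ASCII"
    }
    // Drop any "@modifier" suffix, e.g. "de_DE.ISO8859-1@euro".
    if i := strings.IndexRune(locale, '@'); i >= 0 {
        locale = locale[:i]
    }
    // The codeset, if present, follows the ".".
    if i := strings.IndexRune(locale, '.'); i >= 0 {
        return locale[i+1:]
    }
    // No codeset given; assume UTF-8.
    return "UTF-8"
}

func main() {
    fmt.Println(charsetFromLocale("en_US.UTF-8"))          // UTF-8
    fmt.Println(charsetFromLocale("C"))                    // US-ASCII
    fmt.Println(charsetFromLocale("de_DE.ISO8859-1@euro")) // ISO8859-1
}
```

Note the order matters: the `@modifier` must be stripped before searching for the `.`, or the codeset would come back with the modifier attached.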
22 vendor/github.com/gdamore/tcell/v2/charset_windows.go generated vendored Normal file
@@ -0,0 +1,22 @@
//go:build windows
// +build windows

// Copyright 2015 The TCell Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package tcell

func getCharset() string {
    return "UTF-16"
}
1128 vendor/github.com/gdamore/tcell/v2/color.go generated vendored Normal file (file diff suppressed because it is too large)
53 vendor/github.com/gdamore/tcell/v2/colorfit.go generated vendored Normal file
@@ -0,0 +1,53 @@
// Copyright 2016 The TCell Authors
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use file except in compliance with the License.
|
||||
// You may obtain a copy of the license at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
|
||||
package tcell
|
||||
|
||||
import (
|
||||
"math"
|
||||
|
||||
"github.com/lucasb-eyer/go-colorful"
|
||||
)
|
||||
|
||||
// FindColor attempts to find a given color, or the best match possible for it,
|
||||
// from the palette given. This is an expensive operation, so results should
|
||||
// be cached by the caller.
|
||||
func FindColor(c Color, palette []Color) Color {
|
||||
match := ColorDefault
|
||||
dist := float64(0)
|
||||
r, g, b := c.RGB()
|
||||
c1 := colorful.Color{
|
||||
R: float64(r) / 255.0,
|
||||
G: float64(g) / 255.0,
|
||||
B: float64(b) / 255.0,
|
||||
}
|
||||
for _, d := range palette {
|
||||
r, g, b = d.RGB()
|
||||
c2 := colorful.Color{
|
||||
R: float64(r) / 255.0,
|
||||
G: float64(g) / 255.0,
|
||||
B: float64(b) / 255.0,
|
||||
}
|
||||
// CIE94 is more accurate, but really really expensive.
|
||||
nd := c1.DistanceCIE76(c2)
|
||||
if math.IsNaN(nd) {
|
||||
nd = math.Inf(1)
|
||||
}
|
||||
if match == ColorDefault || nd < dist {
|
||||
match = d
|
||||
dist = nd
|
||||
}
|
||||
}
|
||||
return match
|
||||
}
|
||||
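FindColor above scans the palette, keeping the entry with the smallest CIE76 distance (computed via the external go-colorful package). A minimal, self-contained sketch of the same scan-and-keep-best idea, using a plain Euclidean RGB distance instead of CIE76 and a hypothetical `rgb` type standing in for tcell's `Color`:

```go
package main

import (
	"fmt"
	"math"
)

// rgb is a stand-in for tcell's Color type (hypothetical, for illustration).
// Components are normalized to [0, 1], as in FindColor's colorful.Color values.
type rgb struct{ r, g, b float64 }

// nearest returns the palette entry with the smallest squared Euclidean RGB
// distance to c, mirroring FindColor's loop. It assumes a non-empty palette.
func nearest(c rgb, palette []rgb) rgb {
	best := palette[0]
	bestDist := math.Inf(1)
	for _, p := range palette {
		d := (c.r-p.r)*(c.r-p.r) + (c.g-p.g)*(c.g-p.g) + (c.b-p.b)*(c.b-p.b)
		if d < bestDist {
			best, bestDist = p, d
		}
	}
	return best
}

func main() {
	palette := []rgb{{0, 0, 0}, {1, 0, 0}, {0, 1, 0}, {1, 1, 1}}
	c := nearest(rgb{0.9, 0.1, 0.1}, palette)
	fmt.Println(c == rgb{1, 0, 0}) // true: pure red is the closest entry
}
```

Euclidean RGB distance is a coarser metric than CIE76, which is why the vendored code pays for the colorful conversion; the loop structure is otherwise the same.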
24  vendor/github.com/gdamore/tcell/v2/console_stub.go  (generated, vendored, new file)
@@ -0,0 +1,24 @@
//go:build !windows
// +build !windows

// Copyright 2015 The TCell Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package tcell

// NewConsoleScreen returns a console based screen. This platform
// doesn't have support for any, so it returns nil and a suitable error.
func NewConsoleScreen() (Screen, error) {
	return nil, ErrNoScreen
}
1398  vendor/github.com/gdamore/tcell/v2/console_win.go  (generated, vendored, new file)
File diff suppressed because it is too large
47  vendor/github.com/gdamore/tcell/v2/doc.go  (generated, vendored, new file)
@@ -0,0 +1,47 @@
// Copyright 2018 The TCell Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

// Package tcell provides a lower-level, portable API for building
// programs that interact with terminals or consoles. It works with
// both common (and many uncommon!) terminals or terminal emulators,
// and Windows console implementations.
//
// It provides support for up to 256 colors, text attributes, and box drawing
// elements. A database of terminals built from a real terminfo database
// is provided, along with code to generate new database entries.
//
// Tcell offers very rich support for mice, dependent upon the terminal
// of course. (Windows, XTerm, and iTerm 2 are known to work very well.)
//
// If the environment is not Unicode by default, such as an ISO8859 based
// locale or GB18030, Tcell can convert input and output, so that your
// terminal can operate in whatever locale is most convenient, while the
// application program can just assume "everything is UTF-8". Reasonable
// defaults are used for updating characters to something suitable for
// display. Unicode box drawing characters will be converted to use the
// alternate character set of your terminal, if native conversions are
// not available. If no ACS is available, then some ASCII fallbacks will
// be used.
//
// Note that support for non-UTF-8 locales (other than C) must be enabled
// by the application using RegisterEncoding() -- we don't have them all
// enabled by default to avoid bloating the application unnecessarily.
// (These days UTF-8 is good enough for almost everyone, and nobody should
// be using legacy locales anymore.) Also, actual glyphs for various code
// point will only be displayed if your terminal or emulator (or the font
// the emulator is using) supports them.
//
// A rich set of key codes is supported, with support for up to 65 function
// keys, and various other special keys.
package tcell
140  vendor/github.com/gdamore/tcell/v2/encoding.go  (generated, vendored, new file)
@@ -0,0 +1,140 @@
// Copyright 2022 The TCell Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package tcell

import (
	"strings"
	"sync"

	"golang.org/x/text/encoding"

	gencoding "github.com/gdamore/encoding"
)

var encodings map[string]encoding.Encoding
var encodingLk sync.Mutex
var encodingFallback EncodingFallback = EncodingFallbackFail

// RegisterEncoding may be called by the application to register an encoding.
// The presence of additional encodings will facilitate application usage with
// terminal environments where the I/O subsystem does not support Unicode.
//
// Windows systems use Unicode natively, and do not need any of the encoding
// subsystem when using Windows Console screens.
//
// Please see the Go documentation for golang.org/x/text/encoding -- most of
// the common ones exist already as stock variables. For example, ISO8859-15
// can be registered using the following code:
//
//	import "golang.org/x/text/encoding/charmap"
//
//	...
//	RegisterEncoding("ISO8859-15", charmap.ISO8859_15)
//
// Aliases can be registered as well, for example "8859-15" could be an alias
// for "ISO8859-15".
//
// For POSIX systems, this package will check the environment variables
// LC_ALL, LC_CTYPE, and LANG (in that order) to determine the character set.
// These are expected to have the following pattern:
//
//	$language[.$codeset[@$variant]
//
// We extract only the $codeset part, which will usually be something like
// UTF-8 or ISO8859-15 or KOI8-R. Note that if the locale is either "POSIX"
// or "C", then we assume US-ASCII (the POSIX 'portable character set'
// and assume all other characters are somehow invalid.)
//
// Modern POSIX systems and terminal emulators may use UTF-8, and for those
// systems, this API is also unnecessary. For example, Darwin (MacOS X) and
// modern Linux running modern xterm generally will out of the box without
// any of this. Use of UTF-8 is recommended when possible, as it saves
// quite a lot processing overhead.
//
// Note that some encodings are quite large (for example GB18030 which is a
// superset of Unicode) and so the application size can be expected to
// increase quite a bit as each encoding is added.
//
// The East Asian encodings have been seen to add 100-200K per encoding to the
// size of the resulting binary.
func RegisterEncoding(charset string, enc encoding.Encoding) {
	encodingLk.Lock()
	charset = strings.ToLower(charset)
	encodings[charset] = enc
	encodingLk.Unlock()
}

// EncodingFallback describes how the system behaves when the locale
// requires a character set that we do not support. The system always
// supports UTF-8 and US-ASCII. On Windows consoles, UTF-16LE is also
// supported automatically. Other character sets must be added using the
// RegisterEncoding API. (A large group of nearly all of them can be
// added using the RegisterAll function in the encoding sub package.)
type EncodingFallback int

const (
	// EncodingFallbackFail behavior causes GetEncoding to fail
	// when it cannot find an encoding.
	EncodingFallbackFail = iota

	// EncodingFallbackASCII behavior causes GetEncoding to fall back
	// to a 7-bit ASCII encoding, if no other encoding can be found.
	EncodingFallbackASCII

	// EncodingFallbackUTF8 behavior causes GetEncoding to assume
	// UTF8 can pass unmodified upon failure. Note that this behavior
	// is not recommended, unless you are sure your terminal can cope
	// with real UTF8 sequences.
	EncodingFallbackUTF8
)

// SetEncodingFallback changes the behavior of GetEncoding when a suitable
// encoding is not found. The default is EncodingFallbackFail, which
// causes GetEncoding to simply return nil.
func SetEncodingFallback(fb EncodingFallback) {
	encodingLk.Lock()
	encodingFallback = fb
	encodingLk.Unlock()
}

// GetEncoding is used by Screen implementors who want to locate an encoding
// for the given character set name. Note that this will return nil for
// either the Unicode (UTF-8) or ASCII encodings, since we don't use
// encodings for them but instead have our own native methods.
func GetEncoding(charset string) encoding.Encoding {
	charset = strings.ToLower(charset)
	encodingLk.Lock()
	defer encodingLk.Unlock()
	if enc, ok := encodings[charset]; ok {
		return enc
	}
	switch encodingFallback {
	case EncodingFallbackASCII:
		return gencoding.ASCII
	case EncodingFallbackUTF8:
		return encoding.Nop
	}
	return nil
}

func init() {
	// We always support UTF-8 and ASCII.
	encodings = make(map[string]encoding.Encoding)
	encodings["utf-8"] = gencoding.UTF8
	encodings["utf8"] = gencoding.UTF8
	encodings["us-ascii"] = gencoding.ASCII
	encodings["ascii"] = gencoding.ASCII
	encodings["iso646"] = gencoding.ASCII
}
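The RegisterEncoding comment above describes how tcell derives a character set on POSIX systems: it reads LC_ALL, LC_CTYPE, then LANG, and extracts the $codeset part of a `$language[.$codeset[@$variant]` locale string, treating "C" and "POSIX" as US-ASCII. A standalone sketch of that extraction rule (the helper name `codeset` is hypothetical, not tcell's internal function, and simplified relative to the library):

```go
package main

import (
	"fmt"
	"strings"
)

// codeset extracts the $codeset part of a POSIX locale string of the form
// $language[.$codeset[@$variant]]. The special locales "C" and "POSIX"
// (and an unset locale) map to US-ASCII, per the comment above.
func codeset(locale string) string {
	if locale == "" || locale == "C" || locale == "POSIX" {
		return "US-ASCII"
	}
	if i := strings.IndexByte(locale, '.'); i >= 0 {
		cs := locale[i+1:]
		// Drop a trailing @$variant modifier, e.g. "@euro".
		if j := strings.IndexByte(cs, '@'); j >= 0 {
			cs = cs[:j]
		}
		return cs
	}
	// No codeset component; return the locale name unchanged.
	return locale
}

func main() {
	fmt.Println(codeset("en_US.UTF-8"))           // UTF-8
	fmt.Println(codeset("de_DE.ISO8859-15@euro")) // ISO8859-15
	fmt.Println(codeset("C"))                     // US-ASCII
}
```

The extracted codeset is the string an application would then pass to GetEncoding (after registering any non-default encodings it needs).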
73  vendor/github.com/gdamore/tcell/v2/errors.go  (generated, vendored, new file)
@@ -0,0 +1,73 @@
// Copyright 2015 The TCell Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package tcell

import (
	"errors"
	"time"

	"github.com/gdamore/tcell/v2/terminfo"
)

var (
	// ErrTermNotFound indicates that a suitable terminal entry could
	// not be found. This can result from either not having TERM set,
	// or from the TERM failing to support certain minimal functionality,
	// in particular absolute cursor addressability (the cup capability)
	// is required. For example, legacy "adm3" lacks this capability,
	// whereas the slightly newer "adm3a" supports it. This failure
	// occurs most often with "dumb".
	ErrTermNotFound = terminfo.ErrTermNotFound

	// ErrNoScreen indicates that no suitable screen could be found.
	// This may result from attempting to run on a platform where there
	// is no support for either termios or console I/O (such as nacl),
	// or from running in an environment where there is no access to
	// a suitable console/terminal device. (For example, running on
	// without a controlling TTY or with no /dev/tty on POSIX platforms.)
	ErrNoScreen = errors.New("no suitable screen available")

	// ErrNoCharset indicates that the locale environment the
	// program is not supported by the program, because no suitable
	// encoding was found for it. This problem never occurs if
	// the environment is UTF-8 or UTF-16.
	ErrNoCharset = errors.New("character set not supported")

	// ErrEventQFull indicates that the event queue is full, and
	// cannot accept more events.
	ErrEventQFull = errors.New("event queue full")
)

// An EventError is an event representing some sort of error, and carries
// an error payload.
type EventError struct {
	t   time.Time
	err error
}

// When returns the time when the event was created.
func (ev *EventError) When() time.Time {
	return ev.t
}

// Error implements the error.
func (ev *EventError) Error() string {
	return ev.err.Error()
}

// NewEventError creates an ErrorEvent with the given error payload.
func NewEventError(err error) *EventError {
	return &EventError{t: time.Now(), err: err}
}
53  vendor/github.com/gdamore/tcell/v2/event.go  (generated, vendored, new file)
@@ -0,0 +1,53 @@
// Copyright 2015 The TCell Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package tcell

import (
	"time"
)

// Event is a generic interface used for passing around Events.
// Concrete types follow.
type Event interface {
	// When reports the time when the event was generated.
	When() time.Time
}

// EventTime is a simple base event class, suitable for easy reuse.
// It can be used to deliver actual timer events as well.
type EventTime struct {
	when time.Time
}

// When returns the time stamp when the event occurred.
func (e *EventTime) When() time.Time {
	return e.when
}

// SetEventTime sets the time of occurrence for the event.
func (e *EventTime) SetEventTime(t time.Time) {
	e.when = t
}

// SetEventNow sets the time of occurrence for the event to the current time.
func (e *EventTime) SetEventNow() {
	e.SetEventTime(time.Now())
}

// EventHandler is anything that handles events. If the handler has
// consumed the event, it should return true. False otherwise.
type EventHandler interface {
	HandleEvent(Event) bool
}
28  vendor/github.com/gdamore/tcell/v2/focus.go  (generated, vendored, new file)
@@ -0,0 +1,28 @@
// Copyright 2023 The TCell Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package tcell

// EventFocus is a focus event. It is sent when the terminal window (or tab)
// gets or loses focus.
type EventFocus struct {
	*EventTime

	// True if the window received focus, false if it lost focus
	Focused bool
}

func NewEventFocus(focused bool) *EventFocus {
	return &EventFocus{Focused: focused}
}
41  vendor/github.com/gdamore/tcell/v2/interrupt.go  (generated, vendored, new file)
@@ -0,0 +1,41 @@
// Copyright 2015 The TCell Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package tcell

import (
	"time"
)

// EventInterrupt is a generic wakeup event. Its can be used to
// to request a redraw. It can carry an arbitrary payload, as well.
type EventInterrupt struct {
	t time.Time
	v interface{}
}

// When returns the time when this event was created.
func (ev *EventInterrupt) When() time.Time {
	return ev.t
}

// Data is used to obtain the opaque event payload.
func (ev *EventInterrupt) Data() interface{} {
	return ev.v
}

// NewEventInterrupt creates an EventInterrupt with the given payload.
func NewEventInterrupt(data interface{}) *EventInterrupt {
	return &EventInterrupt{t: time.Now(), v: data}
}
470  vendor/github.com/gdamore/tcell/v2/key.go  (generated, vendored, new file)
@@ -0,0 +1,470 @@
// Copyright 2016 The TCell Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use file except in compliance with the License.
// You may obtain a copy of the license at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package tcell

import (
	"fmt"
	"strings"
	"time"
)

// EventKey represents a key press. Usually this is a key press followed
// by a key release, but since terminal programs don't have a way to report
// key release events, we usually get just one event. If a key is held down
// then the terminal may synthesize repeated key presses at some predefined
// rate. We have no control over that, nor visibility into it.
//
// In some cases, we can have a modifier key, such as ModAlt, that can be
// generated with a key press. (This usually is represented by having the
// high bit set, or in some cases, by sending an ESC prior to the rune.)
//
// If the value of Key() is KeyRune, then the actual key value will be
// available with the Rune() method. This will be the case for most keys.
// In most situations, the modifiers will not be set. For example, if the
// rune is 'A', this will be reported without the ModShift bit set, since
// really can't tell if the Shift key was pressed (it might have been CAPSLOCK,
// or a terminal that only can send capitals, or keyboard with separate
// capital letters from lower case letters).
//
// Generally, terminal applications have far less visibility into keyboard
// activity than graphical applications. Hence, they should avoid depending
// overly much on availability of modifiers, or the availability of any
// specific keys.
type EventKey struct {
	t   time.Time
	mod ModMask
	key Key
	ch  rune
}
// When returns the time when this Event was created, which should closely
// match the time when the key was pressed.
func (ev *EventKey) When() time.Time {
	return ev.t
}

// Rune returns the rune corresponding to the key press, if it makes sense.
// The result is only defined if the value of Key() is KeyRune.
func (ev *EventKey) Rune() rune {
	return ev.ch
}

// Key returns a virtual key code. We use this to identify specific key
// codes, such as KeyEnter, etc. Most control and function keys are reported
// with unique Key values. Normal alphanumeric and punctuation keys will
// generally return KeyRune here; the specific key can be further decoded
// using the Rune() function.
func (ev *EventKey) Key() Key {
	return ev.key
}

// Modifiers returns the modifiers that were present with the key press. Note
// that not all platforms and terminals support this equally well, and some
// cases we will not not know for sure. Hence, applications should avoid
// using this in most circumstances.
func (ev *EventKey) Modifiers() ModMask {
	return ev.mod
}
// KeyNames holds the written names of special keys. Useful to echo back a key
// name, or to look up a key from a string value.
var KeyNames = map[Key]string{
	KeyEnter: "Enter",
	KeyBackspace: "Backspace",
	KeyTab: "Tab",
	KeyBacktab: "Backtab",
	KeyEsc: "Esc",
	KeyBackspace2: "Backspace2",
	KeyDelete: "Delete",
	KeyInsert: "Insert",
	KeyUp: "Up",
	KeyDown: "Down",
	KeyLeft: "Left",
	KeyRight: "Right",
	KeyHome: "Home",
	KeyEnd: "End",
	KeyUpLeft: "UpLeft",
	KeyUpRight: "UpRight",
	KeyDownLeft: "DownLeft",
	KeyDownRight: "DownRight",
	KeyCenter: "Center",
	KeyPgDn: "PgDn",
	KeyPgUp: "PgUp",
	KeyClear: "Clear",
	KeyExit: "Exit",
	KeyCancel: "Cancel",
	KeyPause: "Pause",
	KeyPrint: "Print",
	KeyF1: "F1",
	KeyF2: "F2",
	KeyF3: "F3",
	KeyF4: "F4",
	KeyF5: "F5",
	KeyF6: "F6",
	KeyF7: "F7",
	KeyF8: "F8",
	KeyF9: "F9",
	KeyF10: "F10",
	KeyF11: "F11",
	KeyF12: "F12",
	KeyF13: "F13",
	KeyF14: "F14",
	KeyF15: "F15",
	KeyF16: "F16",
	KeyF17: "F17",
	KeyF18: "F18",
	KeyF19: "F19",
	KeyF20: "F20",
	KeyF21: "F21",
	KeyF22: "F22",
	KeyF23: "F23",
	KeyF24: "F24",
	KeyF25: "F25",
	KeyF26: "F26",
	KeyF27: "F27",
	KeyF28: "F28",
	KeyF29: "F29",
	KeyF30: "F30",
	KeyF31: "F31",
	KeyF32: "F32",
	KeyF33: "F33",
	KeyF34: "F34",
	KeyF35: "F35",
	KeyF36: "F36",
	KeyF37: "F37",
	KeyF38: "F38",
	KeyF39: "F39",
	KeyF40: "F40",
	KeyF41: "F41",
	KeyF42: "F42",
	KeyF43: "F43",
	KeyF44: "F44",
	KeyF45: "F45",
	KeyF46: "F46",
	KeyF47: "F47",
	KeyF48: "F48",
	KeyF49: "F49",
	KeyF50: "F50",
	KeyF51: "F51",
	KeyF52: "F52",
	KeyF53: "F53",
	KeyF54: "F54",
	KeyF55: "F55",
	KeyF56: "F56",
	KeyF57: "F57",
	KeyF58: "F58",
	KeyF59: "F59",
	KeyF60: "F60",
	KeyF61: "F61",
	KeyF62: "F62",
	KeyF63: "F63",
	KeyF64: "F64",
	KeyCtrlA: "Ctrl-A",
	KeyCtrlB: "Ctrl-B",
	KeyCtrlC: "Ctrl-C",
	KeyCtrlD: "Ctrl-D",
	KeyCtrlE: "Ctrl-E",
	KeyCtrlF: "Ctrl-F",
	KeyCtrlG: "Ctrl-G",
	KeyCtrlJ: "Ctrl-J",
	KeyCtrlK: "Ctrl-K",
	KeyCtrlL: "Ctrl-L",
	KeyCtrlN: "Ctrl-N",
	KeyCtrlO: "Ctrl-O",
	KeyCtrlP: "Ctrl-P",
	KeyCtrlQ: "Ctrl-Q",
	KeyCtrlR: "Ctrl-R",
	KeyCtrlS: "Ctrl-S",
	KeyCtrlT: "Ctrl-T",
	KeyCtrlU: "Ctrl-U",
	KeyCtrlV: "Ctrl-V",
	KeyCtrlW: "Ctrl-W",
	KeyCtrlX: "Ctrl-X",
	KeyCtrlY: "Ctrl-Y",
	KeyCtrlZ: "Ctrl-Z",
	KeyCtrlSpace: "Ctrl-Space",
	KeyCtrlUnderscore: "Ctrl-_",
	KeyCtrlRightSq: "Ctrl-]",
	KeyCtrlBackslash: "Ctrl-\\",
	KeyCtrlCarat: "Ctrl-^",
}
// Name returns a printable value or the key stroke. This can be used
// when printing the event, for example.
func (ev *EventKey) Name() string {
	s := ""
	m := []string{}
	if ev.mod&ModShift != 0 {
		m = append(m, "Shift")
	}
	if ev.mod&ModAlt != 0 {
		m = append(m, "Alt")
	}
	if ev.mod&ModMeta != 0 {
		m = append(m, "Meta")
	}
	if ev.mod&ModCtrl != 0 {
		m = append(m, "Ctrl")
	}

	ok := false
	if s, ok = KeyNames[ev.key]; !ok {
		if ev.key == KeyRune {
			s = "Rune[" + string(ev.ch) + "]"
		} else {
			s = fmt.Sprintf("Key[%d,%d]", ev.key, int(ev.ch))
		}
	}
	if len(m) != 0 {
		if ev.mod&ModCtrl != 0 && strings.HasPrefix(s, "Ctrl-") {
			s = s[5:]
		}
		return fmt.Sprintf("%s+%s", strings.Join(m, "+"), s)
	}
	return s
}
// NewEventKey attempts to create a suitable event. It parses the various
// ASCII control sequences if KeyRune is passed for Key, but if the caller
// has more precise information it should set that specifically. Callers
// that aren't sure about modifier state (most) should just pass ModNone.
func NewEventKey(k Key, ch rune, mod ModMask) *EventKey {
	if k == KeyRune && (ch < ' ' || ch == 0x7f) {
		// Turn specials into proper key codes. This is for
		// control characters and the DEL.
		k = Key(ch)
		if mod == ModNone && ch < ' ' {
			switch Key(ch) {
			case KeyBackspace, KeyTab, KeyEsc, KeyEnter:
				// these keys are directly typeable without CTRL
			default:
				// most likely entered with a CTRL keypress
				mod = ModCtrl
			}
		}
	}
	return &EventKey{t: time.Now(), key: k, ch: ch, mod: mod}
}
// ModMask is a mask of modifier keys. Note that it will not always be
// possible to report modifier keys.
type ModMask int16

// These are the modifiers keys that can be sent either with a key press,
// or a mouse event. Note that as of now, due to the confusion associated
// with Meta, and the lack of support for it on many/most platforms, the
// current implementations never use it. Instead, they use ModAlt, even for
// events that could possibly have been distinguished from ModAlt.
const (
	ModShift ModMask = 1 << iota
	ModCtrl
	ModAlt
	ModMeta
	ModNone ModMask = 0
)
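ModMask packs each modifier into one bit of an int16, so a key event can carry any combination of Shift, Ctrl, Alt, and Meta, and Name() recovers the names by testing each bit. A self-contained sketch of that bit-flag decoding, using lowercase stand-ins for tcell's exported constants (the `modMask`/`modNames` names are illustrative, not part of the library):

```go
package main

import (
	"fmt"
	"strings"
)

// modMask mirrors tcell's ModMask bit-flag scheme for illustration.
type modMask int16

const (
	modShift modMask = 1 << iota
	modCtrl
	modAlt
	modMeta
)

// modNames joins the names of the set bits with "+", checking them in the
// same order Name() does above: Shift, Alt, Meta, Ctrl.
func modNames(m modMask) string {
	parts := []string{}
	if m&modShift != 0 {
		parts = append(parts, "Shift")
	}
	if m&modAlt != 0 {
		parts = append(parts, "Alt")
	}
	if m&modMeta != 0 {
		parts = append(parts, "Meta")
	}
	if m&modCtrl != 0 {
		parts = append(parts, "Ctrl")
	}
	return strings.Join(parts, "+")
}

func main() {
	fmt.Println(modNames(modShift | modCtrl)) // Shift+Ctrl
}
```

Because each flag occupies its own bit, combining modifiers is just `|` and testing one is just `&`, which is why ModNone can be defined as the zero value.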
// Key is a generic value for representing keys, and especially special
// keys (function keys, cursor movement keys, etc.) For normal keys, like
// ASCII letters, we use KeyRune, and then expect the application to
// inspect the Rune() member of the EventKey.
type Key int16

// This is the list of named keys. KeyRune is special however, in that it is
// a place holder key indicating that a printable character was sent. The
// actual value of the rune will be transported in the Rune of the associated
// EventKey.
const (
	KeyRune Key = iota + 256
	KeyUp
	KeyDown
	KeyRight
	KeyLeft
	KeyUpLeft
	KeyUpRight
	KeyDownLeft
	KeyDownRight
	KeyCenter
	KeyPgUp
	KeyPgDn
	KeyHome
	KeyEnd
	KeyInsert
	KeyDelete
	KeyHelp
	KeyExit
	KeyClear
	KeyCancel
	KeyPrint
	KeyPause
	KeyBacktab
	KeyF1
	KeyF2
	KeyF3
	KeyF4
	KeyF5
	KeyF6
	KeyF7
	KeyF8
	KeyF9
	KeyF10
	KeyF11
	KeyF12
	KeyF13
	KeyF14
	KeyF15
	KeyF16
	KeyF17
	KeyF18
	KeyF19
	KeyF20
	KeyF21
	KeyF22
	KeyF23
	KeyF24
	KeyF25
	KeyF26
	KeyF27
	KeyF28
	KeyF29
	KeyF30
	KeyF31
	KeyF32
	KeyF33
	KeyF34
	KeyF35
	KeyF36
	KeyF37
	KeyF38
	KeyF39
	KeyF40
	KeyF41
	KeyF42
	KeyF43
	KeyF44
	KeyF45
	KeyF46
	KeyF47
	KeyF48
	KeyF49
	KeyF50
	KeyF51
	KeyF52
	KeyF53
	KeyF54
	KeyF55
	KeyF56
	KeyF57
	KeyF58
	KeyF59
	KeyF60
	KeyF61
	KeyF62
	KeyF63
	KeyF64
)

const (
	// These key codes are used internally, and will never appear to applications.
	keyPasteStart Key = iota + 16384
	keyPasteEnd
)

// These are the control keys. Note that they overlap with other keys,
// perhaps. For example, KeyCtrlH is the same as KeyBackspace.
const (
	KeyCtrlSpace Key = iota
	KeyCtrlA
	KeyCtrlB
	KeyCtrlC
	KeyCtrlD
	KeyCtrlE
	KeyCtrlF
	KeyCtrlG
	KeyCtrlH
	KeyCtrlI
	KeyCtrlJ
	KeyCtrlK
	KeyCtrlL
	KeyCtrlM
	KeyCtrlN
	KeyCtrlO
	KeyCtrlP
	KeyCtrlQ
	KeyCtrlR
	KeyCtrlS
	KeyCtrlT
	KeyCtrlU
	KeyCtrlV
	KeyCtrlW
	KeyCtrlX
	KeyCtrlY
	KeyCtrlZ
	KeyCtrlLeftSq // Escape
	KeyCtrlBackslash
	KeyCtrlRightSq
	KeyCtrlCarat
	KeyCtrlUnderscore
)

// Special values - these are fixed in an attempt to make it more likely
// that aliases will encode the same way.

// These are the defined ASCII values for key codes. They generally match
// with KeyCtrl values.
const (
	KeyNUL Key = iota
	KeySOH
	KeySTX
	KeyETX
	KeyEOT
	KeyENQ
	KeyACK
	KeyBEL
	KeyBS
	KeyTAB
	KeyLF
	KeyVT
	KeyFF
	KeyCR
	KeySO
	KeySI
	KeyDLE
	KeyDC1
	KeyDC2
	KeyDC3
	KeyDC4
	KeyNAK
	KeySYN
	KeyETB
	KeyCAN
	KeyEM
	KeySUB
	KeyESC
	KeyFS
	KeyGS
	KeyRS
	KeyUS
	KeyDEL Key = 0x7F
)

// These keys are aliases for other names.
const (
	KeyBackspace  = KeyBS
	KeyTab        = KeyTAB
	KeyEsc        = KeyESC
	KeyEscape     = KeyESC
	KeyEnter      = KeyCR
	KeyBackspace2 = KeyDEL
)
Some files were not shown because too many files have changed in this diff