Hi there! So you've decided to move your project from Supabase to Insforge. Migrations can be daunting: how do you move all your users without forcing a password reset? How do you transfer every single file and update all the links? What about your database schema and all that precious data?
Don't worry. We got you.
This guide introduces a complete, production-tested migration toolkit designed to seamlessly transition your entire Supabase project to a self-hosted Insforge instance. We're not just talking about a partial data dump; we're talking about a full migration that preserves:
- User accounts with their original IDs and bcrypt-hashed passwords (no forced resets)
- Your database schema, data, and Row Level Security (RLS) policies
- Every storage bucket and file, with the original paths intact
- The storage URLs referenced throughout your database, including inside JSONB fields
This toolkit has been battle-tested in a real production environment, successfully migrating 39 users, 985 database rows, and over 1,500 storage files with zero data loss. For more information on self-hosting, check out the Insforge GitHub repository.
Let's get started.
Before we begin, let's get your environment set up for a smooth migration.
First, ensure you have the necessary tools and access:
- Node.js and npm installed locally
- Admin access to your Supabase project dashboard (and your database password)
- PostgreSQL client tools (psql)
- A running Insforge instance (local or remote) and its API key
Now, let's configure the migration tool itself.
First, clone the toolkit from the Supabase-to-Insforge GitHub repository (or download the source) and install the dependencies.
# Clone the repository (if applicable)
# git clone https://github.com/InsForge/supabase-to-insforge
# Install dependencies
npm install
Next, create your environment configuration file. Copy the example file to create your own .env file.
cp .env.example .env
Now, open the newly created .env file and fill in the credentials for both the Supabase project you're migrating from and your new Insforge instance.
To fill out the .env file, you'll need three key pieces of information from your Supabase project dashboard.
SUPABASE_URL & SUPABASE_SERVICE_ROLE_KEY: Find both in your Supabase dashboard under Project Settings > API. Be sure to copy the service_role secret key, not the public anon key.
SUPABASE_DB_URL (Your Connection String): Find this under Project Settings > Database. Replace the [YOUR-PASSWORD] placeholder in the string with your actual database password. If you've forgotten it, you can reset it in Project Settings > Database.
# ============================================
# Supabase Configuration
# ============================================
# Find these credentials in your Supabase project settings
SUPABASE_URL=https://YOUR_PROJECT.supabase.co
SUPABASE_SERVICE_ROLE_KEY=your_supabase_service_role_key
SUPABASE_DB_URL=postgresql://postgres.xxx:[YOUR-PASSWORD]@...
# ============================================
# Insforge Configuration
# ============================================
# Your Insforge instance details
INSFORGE_API_URL=http://localhost:7130 # Or your remote URL
INSFORGE_API_KEY=your_insforge_api_key
Before running the migration, let's make sure the tool can connect to both databases. Run these commands to test your connections:
# Test Supabase connection
psql "$SUPABASE_DB_URL" -c "SELECT version();"
# Test Insforge connection (adjust for your local/remote setup)
psql postgresql://postgres:postgres@localhost:5432/insforge -c "SELECT version();"
# If you open an interactive psql session instead, exit with:
\q
If both commands execute without errors, you are ready for the main event!
The migration is broken down into three logical phases. Follow them in order for the best results.
This is often the trickiest part of any migration, but this toolkit makes it simple. We will migrate all user accounts while preserving their original passwords.
This command connects to your Supabase project and exports all users from the auth.users table into a local JSON file.
npm run export:auth
Expected Output:
✅ Connected to Supabase PostgreSQL
✅ Exported 39 users to auth-export.json
This creates a file named auth-export.json containing user details, including their IDs and encrypted passwords.
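Under the hood, the export boils down to reading those columns straight out of auth.users. Here's a minimal sketch of the idea using the node-postgres (pg) client; the toolkit's actual query and output format may differ:
// conceptual auth export -- a sketch, not the toolkit's actual script
const { Client } = require('pg');
const fs = require('fs');

async function exportAuthUsers() {
  const client = new Client({ connectionString: process.env.SUPABASE_DB_URL });
  await client.connect();

  // id and encrypted_password (a bcrypt hash) are what make a reset-free migration possible
  const { rows } = await client.query(
    'SELECT id, email, encrypted_password, created_at FROM auth.users'
  );
  await client.end();

  fs.writeFileSync('auth-export.json', JSON.stringify(rows, null, 2));
  console.log(`Exported ${rows.length} users to auth-export.json`);
}

exportAuthUsers();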
Now, we'll import the users from the auth-export.json file into your Insforge instance. The script preserves the original user IDs (critical for foreign keys) and their bcrypt-hashed passwords.
npm run import:auth
Expected Output:
✅ Connected to Insforge API
URL: https://your-insforge-instance.insforge.app
📦 Loaded export data...
⬆️ Importing users...
✅ Imported 41/41 accounts
✅ Import complete!
Pro Tip: This script is idempotent, meaning you can run it multiple times without creating duplicate users. It will simply update existing records if they are found.
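The actual import talks to the Insforge API, but the re-run-safe behavior boils down to an upsert keyed on the original user ID. A rough illustration of the pattern only (the table and column names below are hypothetical, not Insforge's real schema):
// idempotency illustration only -- hypothetical table and columns
const { Client } = require('pg');

async function upsertUser(client, user) {
  // Conflicting on the preserved Supabase user ID means a second run updates
  // the existing row instead of inserting a duplicate.
  await client.query(
    `INSERT INTO app_users (id, email, password_hash)
     VALUES ($1, $2, $3)
     ON CONFLICT (id) DO UPDATE
       SET email = EXCLUDED.email,
           password_hash = EXCLUDED.password_hash`,
    [user.id, user.email, user.encrypted_password]
  );
}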
Next, we'll move your entire PostgreSQL database, including the schema, data, and even your Row Level Security (RLS) policies.
This command uses pg_dump to create a complete SQL backup of your public schemas from Supabase.
npm run export:db
Expected Output:
✅ Database exported successfully
File: database-export.sql (245 KB)
This will generate a database-export.sql file.
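If you're curious what this step wraps, it is essentially a pg_dump call along these lines (a sketch; the toolkit's exact flags may differ):
// conceptual database export -- a sketch of the underlying pg_dump call
const { execSync } = require('child_process');

// --no-owner and --no-privileges keep Supabase-specific roles out of the dump,
// which makes the SQL easier to replay on a different Postgres instance.
execSync(
  `pg_dump "${process.env.SUPABASE_DB_URL}" --schema=public --no-owner --no-privileges --file=database-export.sql`,
  { stdio: 'inherit' }
);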
Supabase relies on some platform-specific functions and extensions. This script cleans the exported SQL file: it removes Supabase-specific code, updates RLS policies (e.g., auth.uid() becomes uid()), and makes the dump compatible with Insforge.
npm run transform:db
Expected Output:
✅ SQL transformed for Insforge
Output: database-export.insforge.sql
This creates a new, Insforge-ready file: database-export.insforge.sql.
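To give you a feel for what "transform" means here, the step performs text-level rewrites of the dump. A minimal sketch of the one rewrite mentioned above (the real script handles more cases than this):
// conceptual transform -- illustrates the auth.uid() -> uid() rewrite described above
const fs = require('fs');

let sql = fs.readFileSync('database-export.sql', 'utf8');

// RLS policies that reference Supabase's auth.uid() are rewritten to uid();
// Supabase-specific extension and role statements would be stripped in a similar way.
sql = sql.replace(/auth\.uid\(\)/g, 'uid()');

fs.writeFileSync('database-export.insforge.sql', sql);
console.log('Wrote database-export.insforge.sql');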
Finally, let's import the transformed SQL file into your Insforge project.
npm run import:db
Expected Output:
🗄️ Importing database to Insforge...
✅ Database import complete!
Tables affected: 12
Rows imported: 985
Your database schema and data are now successfully migrated!
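If you want an extra sanity check before moving on, you can count rows directly against the Insforge database. A sketch, assuming the local self-hosted connection string used earlier (adjust it for your setup):
// optional post-import spot check -- approximate row counts per table
const { Client } = require('pg');

async function countRows() {
  const client = new Client({
    connectionString: 'postgresql://postgres:postgres@localhost:5432/insforge',
  });
  await client.connect();
  const { rows } = await client.query(
    `SELECT relname AS table_name, n_live_tup AS approx_rows
     FROM pg_stat_user_tables
     ORDER BY n_live_tup DESC`
  );
  console.table(rows);
  await client.end();
}

countRows();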
The final phase is to move all your files from Supabase Storage to Insforge Storage. This process preserves the exact file paths, which is crucial for preventing broken links.
First, the script will read the list of buckets from your Supabase project and create identical ones in Insforge, preserving their public or private settings.
npm run create:buckets
Expected Output:
✅ Connected to Supabase PostgreSQL
📦 Found 4 buckets in Supabase:
📋 Creating bucket: generated-images (public)... ✅ Created successfully
📋 Creating bucket: raw-product-images (public)... ✅ Created successfully
...
✅ Bucket creation complete!
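For reference, bucket metadata on the Supabase side lives in the storage.buckets table, so the discovery half of this step is conceptually just (a sketch):
// conceptual bucket discovery -- reads bucket metadata from Supabase
const { Client } = require('pg');

async function listBuckets() {
  const client = new Client({ connectionString: process.env.SUPABASE_DB_URL });
  await client.connect();
  // Supabase stores bucket metadata, including the public flag, in storage.buckets.
  const { rows } = await client.query('SELECT id, name, public FROM storage.buckets');
  await client.end();
  return rows;
}

listBuckets().then((buckets) => console.log(`Found ${buckets.length} buckets`, buckets));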
This command downloads every file from all your Supabase buckets, preserving the original directory structure.
npm run export:storage
Expected Output:
📦 Starting storage export from Supabase...
⬇️ Downloading files...
[1/1572] generated-images/user123/image.jpg ✅
...
[1572/1572] reference-images/ref5.jpg ✅
✅ Export complete! Downloaded: 1,572 files
All files will be saved locally in a storage-downloads/ directory.
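Conceptually, the download side walks each bucket via the Supabase storage API and writes files to disk. Here's a simplified sketch for a single, flat bucket using @supabase/supabase-js (the real script recurses into folders and pages through large listings):
// conceptual storage export -- downloads one (flat) bucket into storage-downloads/
const { createClient } = require('@supabase/supabase-js');
const fs = require('fs');
const path = require('path');

const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_SERVICE_ROLE_KEY);

async function downloadBucket(bucket) {
  const { data: files, error } = await supabase.storage.from(bucket).list('', { limit: 1000 });
  if (error) throw error;

  for (const file of files) {
    const { data, error: dlError } = await supabase.storage.from(bucket).download(file.name);
    if (dlError) { console.error(`Failed: ${bucket}/${file.name}`); continue; }
    const dest = path.join('storage-downloads', bucket, file.name);
    fs.mkdirSync(path.dirname(dest), { recursive: true });
    fs.writeFileSync(dest, Buffer.from(await data.arrayBuffer()));
  }
}

downloadBucket('generated-images'); // one of the buckets created above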
Now, we'll upload all the downloaded files to their corresponding buckets in Insforge.
npm run import:storage
Expected Output:
📦 Found 1572 files to import
⬆️ [1/1572] generated-images/user123/scene/image.jpg... ✅ Uploaded
...
✅ Import complete! Success: 1569/1572
This is the magic wand. Your database likely contains thousands of URLs pointing to Supabase Storage. This script connects to your Insforge database and performs a universal find-and-replace on all tables. It intelligently updates every Supabase storage URL to the new Insforge URL, even inside complex JSONB fields.
npm run update:storage-urls
Expected Output:
✅ Connected to Insforge API
🔄 Universal URL Replacement...
📊 Found:
- 997 generations with Supabase URLs in input
- 1023 generations with Supabase URLs in output
⏳ Universally updating ALL URLs...
✅ All Supabase storage URLs have been updated!
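One way to picture what happens at the SQL level: each affected column gets a replace() of the old Supabase storage URL prefix with your new Insforge prefix, with JSONB columns cast to text and back. A sketch for a single JSONB column (both base URLs are placeholders, and 'generations'/'output' are just the example table and column from the output above):
// conceptual URL rewrite for one JSONB column -- placeholders throughout
const { Client } = require('pg');

const OLD_BASE = 'https://YOUR_PROJECT.supabase.co/storage/v1/object/public'; // Supabase public object URL prefix
const NEW_BASE = 'https://your-insforge-instance.example.com/storage';        // placeholder for your Insforge prefix

async function rewriteJsonbColumn(table, column) {
  const client = new Client({
    connectionString: 'postgresql://postgres:postgres@localhost:5432/insforge',
  });
  await client.connect();
  // Cast JSONB to text, swap the URL prefix, cast back to JSONB.
  await client.query(
    `UPDATE "${table}"
        SET "${column}" = replace("${column}"::text, $1, $2)::jsonb
      WHERE "${column}"::text LIKE '%' || $1 || '%'`,
    [OLD_BASE, NEW_BASE]
  );
  await client.end();
}

rewriteJsonbColumn('generations', 'output');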
And that's it! Your entire project is now migrated.
Your data is migrated, but you're not done yet. Here are the final steps to get your application running on Insforge.
Update Application Code: Change your application's client configuration to point to your Insforge instance instead of Supabase. For detailed guidance on reconfiguring the SDK using MCP, check out our companion blog post.
// Before (Supabase)
const supabase = createClient('SUPABASE_URL', 'SUPABASE_ANON_KEY');
// After (Insforge)
const insforge = createClient('YOUR_INSFORGE_API_URL', 'YOUR_INSFORGE_API_KEY');
End-to-End Testing: Thoroughly test your application.
Clean Up: Once you've confirmed everything is working, you can remove the temporary migration files (the storage-downloads/ directory and the exported .sql and .json files).
A few common issues and how to handle them:
- "File too large" error during storage import: the file exceeds your Insforge instance's upload limit. You may need to increase the limit or migrate the file manually.
- Users can't log in: confirm the import:auth script ran successfully. The bcrypt password hashes should have been preserved as-is.
- Connection errors: double-check that your .env connection strings are correct.
Happy migrating! 🚀