
Updated March 2026

Database

Connect your workflows directly to MongoDB, MySQL, and PostgreSQL databases. Run queries, insert records, update data, and perform complex aggregations — all without writing backend code. Turn extracted web data into persistent, queryable datasets.

No credit card required

14-day free trial

Cancel anytime


How It Works

Get started in minutes

1

Add connection

Enter your database credentials securely. Supports connection strings and individual field configuration.

2

Choose operation

Select from CRUD operations, aggregations, transactions, or bulk actions.

3

Configure query

Build your query visually or write raw SQL/MongoDB queries directly.

4

Execute and use results

Run the query and use results in subsequent workflow steps or export them.

What is the Database Feature?

The Database feature connects your Autonoly workflows directly to relational and document databases. Instead of exporting data to CSV files and manually importing them, your workflows can read from and write to databases in real-time.

This closes the loop between data collection and data storage. Extract data from websites with Data Extraction, process it with Data Processing, and write it directly to your database — all in a single automated workflow. If you are building automated data pipelines for the first time, our guide to automating Google Sheets covers similar concepts with a simpler starting point.

Supported Databases

Autonoly supports the three most popular database systems:

  • PostgreSQL — the industry standard for relational data. Full SQL support including joins, CTEs, window functions, and JSON operations.

  • MySQL — widely used in web applications. Compatible with MySQL 5.7+ and MariaDB.

  • MongoDB — document database for flexible, schema-free data. Full support for the aggregation pipeline, text search, and geospatial queries.

Connection setup supports both connection strings (paste a URI) and field-by-field configuration (host, port, database name, username, password). The system validates the connection before saving it, so you know immediately if the credentials are correct.
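
The two configuration styles carry the same information. As a rough sketch of how a connection URI decomposes into the individual fields a form-based setup would ask for (Python standard library; the URI, credentials, and function name are illustrative, not Autonoly's API):

```python
from urllib.parse import urlparse

def parse_connection_uri(uri: str) -> dict:
    """Split a database connection URI into the individual fields
    (host, port, database, username, password) that a field-by-field
    configuration form would collect."""
    parts = urlparse(uri)
    return {
        "scheme": parts.scheme,
        "host": parts.hostname,
        "port": parts.port,
        "database": parts.path.lstrip("/"),
        "username": parts.username,
        "password": parts.password,
    }

cfg = parse_connection_uri("postgresql://app_user:s3cret@db.example.com:5432/analytics")
# cfg["host"] is "db.example.com", cfg["port"] is 5432, cfg["database"] is "analytics"
```

Whichever style you paste or type, the validated result is the same set of fields.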

CRUD Operations

Every standard database operation is supported:

Create (Insert)

Insert single records or bulk-insert thousands of rows in one operation. Data from any workflow step can be mapped to database columns automatically. Type conversion handles common mismatches — strings to dates, numbers to strings, and JSON to document fields.
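
The bulk path can be pictured with a minimal chunked-insert sketch. SQLite stands in here for your production database, and the `products` schema and chunk size are invented for the example:

```python
import sqlite3

def insert_in_chunks(conn, rows, chunk_size=500):
    """Bulk-insert rows in fixed-size chunks so very large batches
    never go to the database as one giant statement."""
    for i in range(0, len(rows), chunk_size):
        chunk = [(r["name"], r["scraped_at"]) for r in rows[i:i + chunk_size]]
        conn.executemany(
            "INSERT INTO products (name, scraped_at) VALUES (?, ?)", chunk)
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT, scraped_at TEXT)")
rows = [{"name": f"item-{n}", "scraped_at": "2026-03-01"} for n in range(1200)]
insert_in_chunks(conn, rows)
count = conn.execute("SELECT COUNT(*) FROM products").fetchone()[0]
# count is 1200: all rows arrive, three chunks of at most 500
```
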

Read (Query)

Run any query your database supports. For SQL databases, write full SELECT statements with joins, subqueries, and aggregations. For MongoDB, build find queries with filters, projections, and sorts. Results flow into the next workflow step as structured data.

The query editor includes syntax highlighting, auto-completion for table and column names, and a live preview that shows sample results before you commit the query to the workflow. This makes it easy to iterate on complex queries without running the full workflow each time.

Update

Update records based on conditions. Map workflow data to the fields you want to change. Supports single-record updates and bulk updates with WHERE/filter conditions.

Delete

Remove records by condition. Safety checks prevent accidental mass deletion — you must explicitly confirm operations that affect more than a configurable threshold of records.
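
The guard amounts to counting matches before deleting. A minimal sketch of that check, with sqlite3 standing in and the table, threshold, and function name all illustrative:

```python
import sqlite3

def safe_delete(conn, table, where_sql, params, threshold=100, confirmed=False):
    """Count matching rows first; refuse a delete that would exceed the
    threshold unless the caller explicitly confirms."""
    (n,) = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {where_sql}", params).fetchone()
    if n > threshold and not confirmed:
        raise RuntimeError(f"refusing to delete {n} rows without confirmation")
    conn.execute(f"DELETE FROM {table} WHERE {where_sql}", params)
    return n

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE leads (email TEXT)")
conn.executemany("INSERT INTO leads VALUES (?)",
                 [("a@x.com",), ("b@x.com",), ("c@x.com",)])

deleted = safe_delete(conn, "leads", "email = ?", ("a@x.com",), threshold=1)
# deleted is 1; a broad condition like "email LIKE '%@x.com'" matching more
# rows than the threshold would raise unless called with confirmed=True
```
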

Advanced Operations

Aggregations

Build complex analytics queries directly in your workflows:

  • SQL: GROUP BY, HAVING, window functions, CTEs

  • MongoDB: aggregation pipeline with $group, $match, $lookup, $unwind
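
On the SQL side, a GROUP BY with HAVING looks like this minimal sketch (sqlite3 stands in for Postgres/MySQL; the `prices` table and sample values are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (product TEXT, price REAL)")
conn.executemany("INSERT INTO prices VALUES (?, ?)",
                 [("widget", 9.99), ("widget", 11.50), ("gadget", 4.25)])

# Average price per product, keeping only products observed more than once
rows = conn.execute("""
    SELECT product, AVG(price) AS avg_price, COUNT(*) AS n
    FROM prices
    GROUP BY product
    HAVING COUNT(*) > 1
""").fetchall()
# one row for "widget": average price near 10.75 over 2 observations
```

The MongoDB equivalent would be a pipeline of `$match` and `$group` stages producing the same grouped averages.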

Transactions

Wrap multiple operations in a transaction to ensure data consistency. If any step fails, all changes are rolled back. Essential for financial data, inventory management, and any process where partial updates would cause problems.
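
To illustrate the rollback behavior, here is a minimal sketch using sqlite3, whose connection context manager commits on success and rolls back on an exception (the `accounts` schema and the transfer scenario are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [("alice", 100), ("bob", 0)])
conn.commit()

try:
    # "with conn" opens a transaction: commit on success, rollback on exception
    with conn:
        conn.execute("UPDATE accounts SET balance = balance - 150 WHERE name = 'alice'")
        (bal,) = conn.execute(
            "SELECT balance FROM accounts WHERE name = 'alice'").fetchone()
        if bal < 0:
            raise ValueError("insufficient funds")  # aborts; the debit is undone
        conn.execute("UPDATE accounts SET balance = balance + 150 WHERE name = 'bob'")
except ValueError:
    pass

balances = dict(conn.execute("SELECT name, balance FROM accounts"))
# balances == {"alice": 100, "bob": 0}: no partial update survived the failure
```
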

Bulk Operations

Insert, update, or delete thousands of records efficiently:

  • Batch insert — insert up to 100,000 records per operation with automatic chunking

  • Bulk update — update multiple records with different values in a single operation

  • Upsert — insert if new, update if existing, based on a unique key
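
In SQL terms, an upsert is a single statement rather than a query-then-branch. A minimal sketch using SQLite's `INSERT ... ON CONFLICT DO UPDATE` (the `products` table, URL key, and function name are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (url TEXT PRIMARY KEY, price REAL)")

def upsert_price(conn, url, price):
    """Insert a new product row, or refresh the price if the URL
    (the unique key) already exists."""
    conn.execute("""
        INSERT INTO products (url, price) VALUES (?, ?)
        ON CONFLICT (url) DO UPDATE SET price = excluded.price
    """, (url, price))

upsert_price(conn, "https://shop.example/widget", 9.99)  # first sighting: insert
upsert_price(conn, "https://shop.example/widget", 8.49)  # price drop: update in place
count, price = conn.execute("SELECT COUNT(*), MAX(price) FROM products").fetchone()
# count is 1, price is 8.49: one row, carrying the latest value
```

Postgres uses the same `ON CONFLICT` syntax; MongoDB expresses the idea as an update with `upsert: true`.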

Schema Inspection

Before writing queries, you can browse your database schema directly within Autonoly. The schema inspector shows tables, columns, data types, indexes, and relationships for SQL databases, and collections, sample documents, and field frequencies for MongoDB. This eliminates the need to switch to a separate database client while building workflows.
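
Autonoly's inspector is built in, but the underlying metadata is ordinary catalog data. If you wanted to fetch the same information yourself, a sketch against SQLite's catalog looks like this (Postgres and MySQL expose the equivalent via `information_schema`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE leads (id INTEGER PRIMARY KEY, email TEXT NOT NULL)")

# Enumerate tables, then the columns and declared types of one table --
# the raw metadata a schema inspector surfaces
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
columns = [(row[1], row[2]) for row in conn.execute("PRAGMA table_info(leads)")]
# tables == ["leads"]; columns == [("id", "INTEGER"), ("email", "TEXT")]
```
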

Security

Database credentials are stored in the encrypted credential vault — the same system used by all Autonoly features. Connections use SSL/TLS encryption. You can restrict database access to specific IP ranges using your database's built-in access controls, and Autonoly provides static IP addresses for whitelisting.

Integration with Workflows

The Database feature pairs with every data-related capability in Autonoly:

  • [Data Extraction](/features/data-extraction) — extract data from websites and write it to your database automatically

  • [Data Processing](/features/data-processing) — transform data in your workflow before database insertion

  • [Scheduled Execution](/features/scheduled-execution) — run database sync jobs on a recurring schedule

  • [Logic & Flow](/features/logic-flow) — add conditions to decide what gets inserted, updated, or skipped

  • [API & HTTP](/features/api-http) — combine API data with database queries for enriched datasets

  • [Webhooks](/features/webhooks) — trigger database operations from external events

Best Practices

  • Use upserts instead of separate insert-or-update logic. When your workflow extracts data that may or may not already exist in the database, upsert operations are cleaner and faster than querying first and then deciding whether to insert or update. Define a unique key (like email address or product SKU) and let the database handle the conflict resolution.

  • Set up indexes for columns you query frequently. If your workflows read from the database using filters — for example, looking up a lead by email before enriching it — make sure those columns are indexed. The schema inspector can show you which indexes exist and suggest new ones based on your query patterns.

  • Use transactions for multi-step writes. When a workflow writes to multiple tables or collections in sequence, wrap the operations in a transaction. This ensures that if step three fails, steps one and two are rolled back, keeping your data consistent. This is especially important for financial and inventory workflows.

  • Paginate large query results. Instead of pulling 100,000 rows in a single query, use LIMIT/OFFSET (SQL) or skip/limit (MongoDB) to process data in manageable chunks. This prevents memory issues in your workflow and keeps execution times predictable. Learn about building robust data pipelines in our web scraping best practices guide.

  • Test queries with the live preview before running workflows. The query editor's preview mode runs your query against the actual database and shows sample results. Use this to verify that your query returns the expected data before committing it to a production workflow.
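
The pagination practice above can be sketched as a small generator that walks a table one page at a time (sqlite3 stands in for your database; the `leads` schema and page size are illustrative):

```python
import sqlite3

def paginated(conn, page_size=1000):
    """Yield rows in fixed-size pages via LIMIT/OFFSET so a large table
    is never loaded into memory at once."""
    offset = 0
    while True:
        page = conn.execute(
            "SELECT id FROM leads ORDER BY id LIMIT ? OFFSET ?",
            (page_size, offset)).fetchall()
        if not page:
            break
        yield page
        offset += page_size

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE leads (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO leads VALUES (?)", [(i,) for i in range(2500)])
pages = list(paginated(conn, page_size=1000))
# three pages of 1000, 1000, and 500 rows
```

Note the `ORDER BY`: stable ordering is what makes OFFSET-based pages non-overlapping between calls.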

Security & Compliance

Database connections are among the most sensitive credentials in any organization. Autonoly stores all database credentials using AES-256 encryption in the credential vault. Credentials are decrypted only at the moment of connection and are never written to logs, workflow definitions, or API responses.

All database connections are encrypted in transit using SSL/TLS. For databases that require client certificate authentication, Autonoly supports uploading and storing client certificates in the vault alongside the connection credentials. Autonoly provides static IP addresses that you can whitelist in your database's firewall rules, ensuring that only Autonoly's infrastructure can connect.

For organizations with strict access controls, database connections can be restricted by workspace role. An admin can configure which team members can create new connections, which can run read queries, and which can execute write operations. Every database operation is logged in the audit trail with the user who initiated it, the query executed, the number of rows affected, and the timestamp. These logs are available in the security dashboard and can be exported for compliance reviews. Our comparison of automation platforms highlights how Autonoly's database security compares to alternatives.

Common Use Cases

Web Scraping Data Warehouse

A market research team scrapes product data from dozens of e-commerce sites daily using Data Extraction and Browser Automation. Each scrape run writes thousands of product records — names, prices, descriptions, availability, ratings — into a PostgreSQL database via upsert operations. The unique key is the product URL, so existing records are updated with fresh data and new products are inserted automatically. Over time, the database accumulates a comprehensive price history that powers trend analysis dashboards and reporting. This pattern is detailed in our e-commerce price monitoring guide.

CRM Enrichment Pipeline

A sales team maintains a MongoDB database of prospects. When new leads enter the database from Webhooks (e.g., from a form submission), a workflow triggers automatically. The workflow reads the new lead record, visits the lead's company website using Browser Automation, extracts company size, industry, funding stage, and recent news using Data Extraction, and writes the enriched data back to the lead record. The entire enrichment process runs in under a minute, ensuring that by the time a sales rep opens the lead record, the context is already there.

Inventory Sync Across Suppliers

A retail company tracks inventory from multiple suppliers in a MySQL database. Scheduled workflows run every four hours, visiting each supplier's portal with Browser Automation to check stock levels. The workflow compares the scraped quantities against the current database values and updates any changes. When stock drops below a reorder threshold, Logic & Flow triggers an alert via Email Campaigns to the procurement team. The transaction system ensures that the stock check and alert happen atomically — no partial updates if a step fails.

Lead Deduplication and Cleanup

Over time, databases accumulate duplicate records from multiple data sources. A weekly Scheduled Execution workflow reads all lead records, runs them through Data Processing for deduplication based on email address and company name, and merges duplicate records while preserving the most complete data from each. The cleaned results are written back to the database in a single transaction. Before the cleanup runs, a backup query exports the current data set, providing a safety net in case the merge logic needs adjustment.

See the pricing page for database connection limits and query volume per plan.

Capabilities

Everything in Database

Powerful tools that work together to automate your workflow end to end.

01

Multi-Database Support

Connect to PostgreSQL, MySQL, and MongoDB with full feature support for each platform.

PostgreSQL

MySQL / MariaDB

MongoDB

SSL connections

02

Full CRUD

Insert, read, update, and delete records with visual query builders or raw query input.

Insert / bulk insert

Query with filters

Conditional updates

Safe delete with thresholds

03

Aggregations

Run complex analytics queries including GROUP BY, window functions, and MongoDB aggregation pipelines.

SQL aggregations

MongoDB pipelines

Window functions

CTE support

04

Transactions

Wrap multiple operations in transactions for data consistency. Automatic rollback on failure.

Multi-operation transactions

Auto-rollback

Consistency guarantees

Deadlock handling

05

Bulk Operations

Insert, update, or upsert up to 100K records per operation with automatic batching.

100K record batches

Automatic chunking

Upsert support

Progress tracking

06

Visual Query Builder

Build queries visually by selecting tables, columns, and conditions — or switch to raw SQL/MongoDB syntax.

Drag-and-drop builder

Raw query mode

Query preview

Result preview

Use Cases

What You Can Build

Real-world automations people build with Database every day.

01

Data Warehousing

Extract data from multiple websites and consolidate it in a central database for analysis.

02

CRM Sync

Keep your database in sync with web-sourced data by running scheduled extraction and insertion workflows.

03

Inventory Tracking

Scrape product availability from supplier sites and update inventory records in your database automatically.

FAQ

Frequently Asked Questions

Everything you need to know about Database.

Ready to try Database?

Join thousands of teams automating their work with Autonoly. Start free, no credit card required.
