What is SSH & Terminal?
SSH & Terminal lets you execute commands on remote servers directly from your automation workflows. Whether you need to run a Python data analysis script, manage files on a server, or deploy code, you can do it all without leaving Autonoly.
Every command runs inside a secure, isolated environment with a full Linux shell. You get the same capabilities you'd have if you SSH'd into a server yourself — but automated, repeatable, and integrated with the rest of your workflow pipeline.
When to Use SSH & Terminal
SSH & Terminal is the right choice when your automation needs go beyond what a browser can do:
Processing large datasets that need server-side compute power
Running Python scripts with specialized libraries
Managing files across servers
Deploying code or running build pipelines
Collecting server metrics and health checks
Server-Side Python
The most popular use of SSH & Terminal is running Python scripts with full library access. Unlike browser-based JavaScript, server-side Python gives you access to the entire Python ecosystem.
Pre-Installed Libraries
Common data science and automation libraries come pre-installed:
pandas — dataframe manipulation, CSV/Excel I/O, data cleaning
numpy — numerical computing, array operations, statistical functions
requests — HTTP client for API calls from within your scripts
beautifulsoup4 — HTML/XML parsing for post-processing scraped content
You can also install any package with pip at runtime. Need a specialized library for geocoding, NLP, or financial calculations? Just include a pip install command in your script and the package is available immediately.
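One common pattern is to guard the install so pip only runs when the library is missing. The helper below is an illustrative sketch, not an Autonoly API:

```python
import importlib
import subprocess
import sys
from typing import Optional

def ensure_package(package: str, import_name: Optional[str] = None) -> None:
    """Install `package` with pip at runtime unless it is already importable."""
    name = import_name or package
    try:
        importlib.import_module(name)
    except ImportError:
        subprocess.check_call([sys.executable, "-m", "pip", "install", package])

# `statistics` is stdlib, so this call is a no-op; a real workflow might run
# ensure_package("beautifulsoup4", import_name="bs4"), since the pip name
# and the import name can differ
ensure_package("statistics")
```

Guarding the import keeps repeated workflow runs fast: pip is only invoked on the first run that actually lacks the package.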
How Python Scripts Work
- Write your Python script in the workflow node configuration
- Reference input data from previous steps using variables
- The script executes in an isolated environment with full stdout/stderr capture
- Output is captured and passed to the next step in your workflow
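A minimal sketch of that flow, with the input inlined rather than injected by a previous step (the exact variable syntax is Autonoly-specific, so a plain JSON string stands in for it here):

```python
import json

def transform(payload: str) -> str:
    """Take the JSON emitted by a previous workflow step, clean it,
    and return JSON for the next step."""
    rows = json.loads(payload)
    # Example cleaning: drop rows without a price and normalise casing
    cleaned = [
        {"name": r["name"].strip().title(), "price": float(r["price"])}
        for r in rows
        if r.get("price") not in (None, "")
    ]
    return json.dumps(cleaned)

# In a real node this payload would come from the previous step
incoming = '[{"name": " acme widget ", "price": "9.99"}, {"name": "x", "price": ""}]'
print(transform(incoming))  # stdout is captured and passed downstream
```

Because stdout is captured, whatever the script prints becomes the input of the next node in the workflow.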
This makes Python scripts perfect for tasks like:
Data transformation — reshape, pivot, merge datasets from data extraction
ML inference — run trained models on extracted data for classification or prediction
Report generation — create charts, PDFs, or formatted summaries with matplotlib or reportlab
Custom calculations — financial modeling, statistical analysis, scoring algorithms
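As a sketch of the transformation case, here is a pandas pivot over invented per-site pricing data (the dataset and column names are hypothetical):

```python
import pandas as pd

# Hypothetical extract: one row per (product, site) price observation
prices = pd.DataFrame({
    "product": ["widget", "widget", "gadget", "gadget"],
    "site": ["siteA", "siteB", "siteA", "siteB"],
    "price": [9.99, 10.49, 24.00, 22.50],
})

# Pivot into a product x site comparison table
comparison = prices.pivot(index="product", columns="site", values="price")

# Spread between the cheapest and most expensive site per product
comparison["spread"] = comparison.max(axis=1) - comparison.min(axis=1)
print(comparison)
```

The same few lines replace what would otherwise be a loop-heavy reshaping step, which is exactly where server-side pandas pays off over in-browser scripting.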
Combine server-side Python with Data Processing nodes for a flexible data pipeline.
File Management
SSH & Terminal includes full file management capabilities for moving data in and out of your automation:
Upload files to servers — push input files, configuration, or datasets to the execution environment
Download results — retrieve generated reports, processed files, or analysis output
Cloud storage transfer — upload files to cloud storage and generate shareable download URLs
50MB per-file limit — applies to cloud uploads only; direct server-to-server transfers have no size limit
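A script can check the 50MB cloud limit up front and route oversized files to a direct transfer instead; this guard is a sketch, not a built-in helper:

```python
import os

CLOUD_UPLOAD_LIMIT = 50 * 1024 * 1024  # 50MB per-file cloud upload limit

def can_cloud_upload(path: str) -> bool:
    """Return True if the file fits under the cloud upload limit."""
    return os.path.getsize(path) <= CLOUD_UPLOAD_LIMIT
```

Checking before the upload step avoids a mid-workflow failure on large generated files.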
File Workflow Example
A typical file-heavy workflow looks like this:
- Extract data from a website into a structured dataset
- Upload the dataset to the server environment
- Run a Python script that processes, analyzes, and generates a PDF report
- Upload the PDF to cloud storage
- Send the download link via email or Slack
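The processing step in the middle of that workflow might look like the sketch below, which summarises a dataset into a plain-text report body (a real workflow would hand this to reportlab to render the PDF; the dataset is invented):

```python
import pandas as pd

def build_report(df: pd.DataFrame) -> str:
    """Summarise a scraped dataset into a plain-text report body."""
    lines = [
        f"Rows analysed: {len(df)}",
        f"Mean price: {df['price'].mean():.2f}",
        f"Cheapest item: {df.loc[df['price'].idxmin(), 'name']}",
    ]
    return "\n".join(lines)

# Stand-in for the dataset uploaded in the previous step
data = pd.DataFrame({"name": ["a", "b", "c"], "price": [3.0, 1.5, 2.25]})
print(build_report(data))
```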
Real-World Pipelines
SSH & Terminal becomes most powerful when combined with other Autonoly features in end-to-end pipelines:
Data Analysis Pipeline
Scrape competitor pricing data from multiple e-commerce sites, then SSH into a server to run a Python analysis that calculates price trends, identifies outliers, and generates a comparative report. Finally, push the results to Google Sheets and notify your team on Slack.
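The trend-and-outlier analysis at the heart of that pipeline can be sketched in a few lines of numpy and pandas; the pricing history below is invented for illustration:

```python
import numpy as np
import pandas as pd

# Invented pricing history: ten scraped observations, one clear outlier
df = pd.DataFrame({
    "day": range(1, 11),
    "price": [10.0, 10.2, 10.1, 10.3, 10.2, 10.0, 10.1, 10.3, 10.2, 24.9],
})

# Trend: least-squares slope of price over time
slope, intercept = np.polyfit(df["day"], df["price"], 1)

# Outliers: observations more than two standard deviations from the mean
mean, std = df["price"].mean(), df["price"].std()
df["outlier"] = (df["price"] - mean).abs() > 2 * std
```

The flagged rows and the slope are what the comparative report would highlight before the results are pushed to Google Sheets.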
Server Monitoring
Set up scheduled workflows that SSH into your production servers, run health check commands, collect CPU/memory/disk metrics, and route alerts based on thresholds. Use Logic & Flow for conditional alerting — only notify when metrics exceed normal ranges.
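A threshold check like that can be done with the standard library alone; the sketch below covers disk usage (CPU and memory checks would follow the same pattern, and the 90% threshold is an assumed example):

```python
import shutil

def disk_alert(path: str = "/", threshold: float = 0.90) -> bool:
    """Return True when disk usage at `path` exceeds the threshold."""
    usage = shutil.disk_usage(path)
    return usage.used / usage.total > threshold

# Only notify when the metric exceeds its normal range
if disk_alert():
    print("ALERT: disk usage above 90%")
else:
    print("OK: disk usage within normal range")
```

In a workflow, the boolean result feeds a Logic & Flow branch that decides whether to send the alert.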
Deployment Automation
Build CI/CD-style pipelines: pull the latest code from a repository, run test suites, build artifacts, and deploy to staging or production. Use webhook triggers to kick off the pipeline automatically when code is pushed. Add Slack notifications at each stage so your team stays informed.
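A staged pipeline like that can be sketched as a list of commands that stops on the first failure; the stage commands below are hypothetical examples, not a prescribed setup:

```python
import subprocess

# Hypothetical CI-style stages: each is a name plus a command
STAGES = [
    ("pull", ["git", "pull", "--ff-only"]),
    ("test", ["pytest", "-q"]),
    ("build", ["make", "build"]),
]

def run_pipeline(stages) -> str:
    """Run each stage in order; stop and report the first failure."""
    for name, cmd in stages:
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode != 0:
            # A real workflow would branch to a Slack notification here
            return f"failed at {name}"
    return "deployed"
```

Returning a status string lets the next workflow node route the "deployed" and "failed at ..." cases to different notifications.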
Research Pipeline
Collect data from academic databases or government portals using browser automation, then run statistical analysis with Python's scipy and statsmodels libraries. Generate publication-ready charts and export to Google Drive.
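As a flavour of that analysis step, here is a Welch-style t-statistic computed by hand with numpy on invented measurements (scipy.stats.ttest_ind would normally do this in one call):

```python
import numpy as np

# Invented measurements collected from two portal sources
group_a = np.array([2.1, 2.4, 2.2, 2.5, 2.3])
group_b = np.array([2.9, 3.1, 3.0, 3.2, 2.8])

# Welch t-statistic: mean difference over the combined standard error
mean_diff = group_b.mean() - group_a.mean()
se = np.sqrt(group_a.var(ddof=1) / len(group_a)
             + group_b.var(ddof=1) / len(group_b))
t_stat = mean_diff / se
```

With scipy installed via pip at runtime, the same comparison (plus the p-value) is a single `ttest_ind` call.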
Security
Security is built into every layer of SSH & Terminal:
Encrypted credentials — all SSH keys and passwords are stored with AES-256 encryption
Isolated execution — each workflow run gets a fresh, clean environment that is destroyed after completion
Standard SSH protocols — all connections use industry-standard SSH encryption
No data persistence — execution environments are ephemeral; nothing lingers between runs
See Security features for complete details on how Autonoly protects your data and credentials.
Check the pricing page for details on execution time limits and available compute resources per plan. Browse the templates library for pre-built server-side automation workflows.