Updated March 2026

SSH & Terminal

Run commands on remote servers, execute Python scripts, transfer files, and build full server-side pipelines — all inside secure, isolated cloud environments.

No credit card required

14-day free trial

Cancel anytime


How It Works

Get started in minutes

1. Connect to a server

Enter your server credentials or use key-based authentication.

2. Run commands

Execute shell commands, scripts, and manage files remotely.

3. Process data

Run Python scripts with full library access for custom processing.

4. Get results back

Download files or push results to cloud storage and integrations.
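The run-a-command-and-capture-output core of steps 2 and 4 can be sketched with Python's standard library. This is only an illustration of the pattern Autonoly's node automates for you; the host in the comment is hypothetical:

```python
import subprocess

def run_command(argv, timeout=60):
    """Execute a command, capturing exit code, stdout, and stderr."""
    result = subprocess.run(argv, capture_output=True, text=True, timeout=timeout)
    return result.returncode, result.stdout, result.stderr

# Over SSH the pattern is identical -- only the argv changes
# (hypothetical host; in Autonoly, credentials come from the node config):
#   run_command(["ssh", "deploy@example.com", "df -h /"])

code, out, _ = run_command(["echo", "hello from the shell"])
print(code, out.strip())  # → 0 hello from the shell
```

The exit code tells the workflow whether the step succeeded, and the captured output becomes input for the next step.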

What is SSH & Terminal?

SSH & Terminal lets you execute commands on remote servers directly from your automation workflows. Whether you need to run a Python data analysis script, manage files on a server, or deploy code, you can do it all without leaving Autonoly.

Every command runs inside a secure, isolated environment with a full Linux shell. You get the same capabilities you'd have if you SSH'd into a server yourself — but automated, repeatable, and integrated with the rest of your workflow pipeline.

When to Use SSH & Terminal

SSH & Terminal is the right choice when your automation needs go beyond what a browser can do:

  • Processing large datasets that need server-side compute power

  • Running Python scripts with specialized libraries

  • Managing files across servers

  • Deploying code or running build pipelines

  • Collecting server metrics and health checks

Server-Side Python

The most popular use of SSH & Terminal is running Python scripts with full library access. Unlike browser-based JavaScript, server-side Python gives you access to the entire Python ecosystem.

Pre-Installed Libraries

Common data science and automation libraries come pre-installed:

  • pandas — dataframe manipulation, CSV/Excel I/O, data cleaning

  • numpy — numerical computing, array operations, statistical functions

  • requests — HTTP client for API calls from within your scripts

  • beautifulsoup4 — HTML/XML parsing for post-processing scraped content

You can also install any package with pip at runtime. Need a specialized library for geocoding, NLP, or financial calculations? Just include the pip install command in your script and it's available immediately.
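Installing at runtime can also be done from inside the script itself. A minimal sketch (the package name `geopy` is only an example, not something your workflow requires):

```python
import subprocess
import sys

def pip_argv(package):
    """argv that installs a package into the current interpreter's environment."""
    return [sys.executable, "-m", "pip", "install", "--quiet", package]

def pip_install(package):
    subprocess.run(pip_argv(package), check=True)

# At the top of a workflow script (hypothetical package):
# pip_install("geopy")
# from geopy.geocoders import Nominatim
```

Using `sys.executable -m pip` guarantees the package lands in the same environment the script is running in.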

How Python Scripts Work

  1. Write your Python script in the workflow node configuration
  2. Reference input data from previous steps using variables
  3. The script executes in an isolated environment with full stdout/stderr capture
  4. Output is captured and passed to the next step in your workflow
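The input/output contract can be pictured like this. The exact variable-passing mechanism is Autonoly-specific, so this sketch simply assumes JSON text in and JSON text printed to stdout; the sample records are invented for illustration:

```python
import json

def transform(records):
    """Example transformation: flag rows whose price exceeds 100."""
    return [dict(r, flagged=r["price"] > 100) for r in records]

# Input from the previous step (sample data standing in for a workflow variable):
raw = '[{"sku": "A1", "price": 120}, {"sku": "B2", "price": 80}]'

# Whatever you print to stdout is captured and passed to the next step.
print(json.dumps(transform(json.loads(raw))))
```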

This makes Python scripts perfect for tasks like:

  • Data transformation — reshape, pivot, merge datasets from data extraction

  • ML inference — run trained models on extracted data for classification or prediction

  • Report generation — create charts, PDFs, or formatted summaries with matplotlib or reportlab

  • Custom calculations — financial modeling, statistical analysis, scoring algorithms
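As a taste of the custom-calculations case, here is a small outlier scorer using only the standard library; a pandas or numpy version would follow the same shape:

```python
import statistics

def zscore_outliers(values, threshold=2.0):
    """Return values more than `threshold` sample standard deviations from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > threshold]

print(zscore_outliers([10, 11, 9, 10, 12, 50]))  # → [50]
```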

Combine server-side Python with Data Processing nodes for a flexible data pipeline.

File Management

SSH & Terminal includes full file management capabilities for moving data in and out of your automation:

  • Upload files to servers — push input files, configuration, or datasets to the execution environment

  • Download results — retrieve generated reports, processed files, or analysis output

  • Cloud storage transfer — upload files to cloud storage and generate shareable download URLs

  • 50MB per file limit — for cloud uploads; direct server-to-server transfers have no limit
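Because the 50MB ceiling applies only to cloud uploads, a workflow can decide the transfer route up front. A minimal guard, with the limit taken from the list above:

```python
import os
import tempfile

MAX_CLOUD_UPLOAD = 50 * 1024 * 1024  # 50 MB cloud-upload limit

def fits_cloud_upload(path):
    """True if the file is small enough for a cloud upload with a share URL."""
    return os.path.getsize(path) <= MAX_CLOUD_UPLOAD

# Demo with a small temp file; oversized files would instead go
# server-to-server, which has no size limit.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"report contents")
print(fits_cloud_upload(f.name))  # → True
```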

File Workflow Example

A typical file-heavy workflow looks like this:

  1. Extract data from a website into a structured dataset
  2. Upload the dataset to the server environment
  3. Run a Python script that processes, analyzes, and generates a PDF report
  4. Upload the PDF to cloud storage
  5. Send the download link via email or Slack

Real-World Pipelines

SSH & Terminal becomes most powerful when combined with other Autonoly features in end-to-end pipelines:

Data Analysis Pipeline

Scrape competitor pricing data from multiple e-commerce sites, then SSH into a server to run a Python analysis that calculates price trends, identifies outliers, and generates a comparative report. Finally, push the results to Google Sheets and notify your team on Slack.

Server Monitoring

Set up scheduled workflows that SSH into your production servers, run health check commands, collect CPU/memory/disk metrics, and route alerts based on thresholds. Use Logic & Flow for conditional alerting — only notify when metrics exceed normal ranges.
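The threshold logic itself is plain Python. A sketch using `shutil.disk_usage` for the disk check (when run over SSH on a production server, the same script reports on that host's disks); the 90% threshold is an example value:

```python
import shutil

def disk_alerts(paths, threshold=0.9):
    """Return the paths whose disk usage exceeds `threshold` (a fraction of capacity)."""
    alerts = []
    for path in paths:
        usage = shutil.disk_usage(path)
        if usage.used / usage.total > threshold:
            alerts.append(path)
    return alerts

# Route a non-empty result to Slack or email with a Logic & Flow condition:
if disk_alerts(["/"]):
    print("disk usage above 90% -- alert the team")
```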

Deployment Automation

Build CI/CD-style pipelines: pull the latest code from a repository, run test suites, build artifacts, and deploy to staging or production. Use webhook triggers to kick off the pipeline automatically when code is pushed. Add Slack notifications at each stage so your team stays informed.
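A pipeline like that reduces to running steps in order and aborting on the first failure. A sketch under assumed conventions: the repository path, test command, and `make build` target are all hypothetical placeholders for your project's own commands:

```python
import subprocess

def run_step(argv):
    """Run one pipeline step; raise if it exits non-zero."""
    proc = subprocess.run(argv, capture_output=True, text=True)
    if proc.returncode != 0:
        raise RuntimeError(f"step failed ({proc.returncode}): {argv}")
    return proc.stdout

def pipeline(repo_dir):
    run_step(["git", "-C", repo_dir, "pull", "--ff-only"])  # pull latest code
    run_step(["python", "-m", "pytest"])                    # run the test suite
    run_step(["make", "-C", repo_dir, "build"])             # build artifacts
```

Wrapping each step this way means a failed test run stops the deploy, and the captured output can feed the stage-by-stage Slack notifications.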

Research Pipeline

Collect data from academic databases or government portals using browser automation, then run statistical analysis with Python's scipy and statsmodels libraries. Generate publication-ready charts and export to Google Drive.

Security

Security is built into every layer of SSH & Terminal:

  • Encrypted credentials — all SSH keys and passwords are stored with AES-256 encryption

  • Isolated execution — each workflow run gets a fresh, clean environment that is destroyed after completion

  • Standard SSH protocols — all connections use industry-standard SSH encryption

  • No data persistence — execution environments are ephemeral; nothing lingers between runs

See Security features for complete details on how Autonoly protects your data and credentials.

Check the pricing page for details on execution time limits and available compute resources per plan. Browse the templates library for pre-built server-side automation workflows.

Capabilities

Everything in SSH & Terminal

Powerful tools that work together to automate your workflows end to end.

01. SSH Connection

Connect to any server with key-based or password authentication. Persistent sessions across workflow steps.

Key & password auth · Persistent connections · Port forwarding · Jump host support

02. Command Execution

Run any shell command and capture stdout/stderr. Chain commands, use pipes, and handle exit codes.

Shell command execution · Output capture · Exit code handling · Environment variables
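Pipes and exit codes compose in a single shell invocation. A sketch that assumes `bash` is present and uses `pipefail` so a failure anywhere in the pipe surfaces in the exit code:

```python
import subprocess

result = subprocess.run(
    ["bash", "-c", "set -o pipefail; printf 'a\\nb\\na\\n' | sort | uniq -c"],
    capture_output=True, text=True,
)
print(result.returncode)  # 0: every stage of the pipe succeeded
print(result.stdout)      # counts of each distinct line
```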

03. File Transfer

Upload and download files between your workflow and remote servers via SCP/SFTP.

File upload (SCP) · File download (SCP) · Directory transfer · Large file support

04. Python Scripts

Execute Python scripts on remote containers with package installation and file output.

Python 3 runtime · pip install · File I/O · Library access

05. Isolated Environments

Every execution runs in an isolated, clean environment. Your scripts get a fresh setup every time.

Isolated execution · Fresh environments · No setup needed · Auto-cleanup

06. Cloud Storage Upload

Transfer files from containers to S3/cloud storage. Generate download URLs for sharing.

S3 upload · Download URL generation · 50MB file limit · Base64 encoding

Use Cases

What You Can Build

Real-world automations that people build with SSH & Terminal every day.

01. Data Pipelines

Scrape data with the browser, process it with Python on a server, and deliver results to your tools.

02. Server Monitoring

Run health checks, collect metrics, and alert your team via Slack or email when issues arise.

03. Deployment Automation

Pull code, run tests, build artifacts, and deploy — triggered on schedule or via webhook.


Ready to try SSH & Terminal?

Join thousands of teams automating their work with Autonoly. Start for free, no credit card required.
