Installation
1. Install Docker Desktop: https://www.docker.com/products/docker-desktop
2. Allocate a minimum of 4 GB of memory to Docker.
3. Download cloudio_demo.zip and unzip it (see the sketch after this list).
4. Launch CloudIO Apps using any modern browser: http://localhost
5. Sign in as demo / demo.
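Assuming the demo archive ships with a Docker Compose file (the directory and file names below are assumptions; follow the instructions bundled with the zip if they differ), launching the demo might look like:

```bash
# hypothetical launch; assumes cloudio_demo.zip unpacks a docker-compose.yml
unzip cloudio_demo.zip
cd cloudio_demo
docker compose up -d   # then open http://localhost in a browser
```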
Refer to the following links to download and install Apache Kafka 2.7.1:

- Download: https://www.apache.org/dyn/closer.cgi?path=/kafka/2.7.1/kafka_2.13-2.7.1.tgz
- Quickstart: https://kafka.apache.org/quickstart
- Confluent Cloud pricing (managed option): https://www.confluent.io/confluent-cloud/pricing
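For a quick local setup, the standard quickstart for Kafka 2.7.1 boils down to the following, after downloading kafka_2.13-2.7.1.tgz from the mirror link above (Kafka 2.7 still requires ZooKeeper):

```bash
# extract the 2.7.1 release (Scala 2.13 build)
tar -xzf kafka_2.13-2.7.1.tgz
cd kafka_2.13-2.7.1

# start ZooKeeper, then the Kafka broker (each in its own terminal, or background them)
bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties
```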
Refer to the following link for MySQL installation: https://dev.mysql.com/doc/mysql-installer/en/
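After installing MySQL, you will need a schema and users for the platform's metadata. A minimal sketch, assuming the database and user names cloudio and cloudio_ro (these names are illustrative, not mandated by CloudIO; the read-only user is for READONLY_DATABASE_URL below):

```bash
mysql -u root -p <<'SQL'
CREATE DATABASE cloudio CHARACTER SET utf8mb4;
CREATE USER 'cloudio'@'%' IDENTIFIED BY 'choose-a-strong-password';
GRANT ALL PRIVILEGES ON cloudio.* TO 'cloudio'@'%';
-- read-only user for ad hoc queries from SQL Worksheet
CREATE USER 'cloudio_ro'@'%' IDENTIFIED BY 'choose-another-password';
GRANT SELECT ON cloudio.* TO 'cloudio_ro'@'%';
SQL
```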
Refer to the following link to install Redis: https://redis.io/topics/quickstart
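The Redis quickstart amounts to building from source and starting the server:

```bash
# download, build, and start a local Redis server
wget https://download.redis.io/redis-stable.tar.gz
tar xvzf redis-stable.tar.gz
cd redis-stable
make
src/redis-server
```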
You can use either a hardware or a software load balancer (e.g., NGINX or Apache) for load balancing, reverse proxying, and SSL termination. Refer to the following link to install NGINX: https://www.nginx.com/resources/wiki/start/topics/tutorials/install/
Configure your load balancer/reverse proxy to forward incoming HTTPS and WSS requests to the host(s)/port(s) on which the CloudIO Platform is configured to run.
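A minimal NGINX sketch for such a setup is shown below. The server name, certificate paths, and upstream address (127.0.0.1:3090, the example HOST value from the table below) are placeholders to adjust:

```nginx
server {
    listen 443 ssl;
    server_name cloudio.example.com;

    ssl_certificate     /etc/nginx/certs/fullchain.pem;
    ssl_certificate_key /etc/nginx/certs/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:3090;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto https;

        # required so WSS (WebSocket) connections can upgrade
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```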
Make sure to either use a trusted network or SSL/TLS-enabled connections to Redis, Kafka, and MySQL.
Once you obtain a license from CloudIO, follow the instructions to download cloudio-platform.zip, unzip it to a directory (e.g., /mnt/cloudio), and update the .env file with appropriate values for the following environment variables:
| Environment Variable | Description | Required | Valid Values | Default / Example |
| --- | --- | --- | --- | --- |
| API | Set to true to enable the API service (UI backend) | Yes | true, false | |
| SCHEDULER | Set to true to enable the scheduler service | Yes | true, false | |
| WORKFLOW | Set to true to enable the multi-node workflow service | Yes | true, false | |
| IO_ENV | Deployment environment | Yes | development, test, production | development |
| JWT_SECRET | Used to encode/decode JWT tokens | Yes | | secret |
| ARGON_SECRET | Used for password hashing | Yes | | secret |
| ARGON_ITERATIONS | Number of iterations used for password hashing | No | | |
| ARGON_MEMORY_SIZE | Amount of memory used for password hashing | No | | |
| INSTANCE_ID | A unique name for this instance | Yes | | cloudio_node1 |
| HOST | The IP address and port on which the web server listens for incoming connections. You can run multiple instances on the same host with different ports and/or on multiple hosts, depending on the load. A single instance can scale up to a million requests per 20 minutes. | Yes | | 127.0.0.1:3090 |
| API_RATELIMIT | Number of API calls allowed per IP address per hour | Yes | | 1000 |
| AUTH_RATELIMIT | Number of sign-in API calls allowed per IP address per hour | Yes | | 12 |
| STATUS_RATELIMIT | Number of status API calls allowed per IP address per hour | Yes | | 60 |
| TMP_DIR | Temp directory path | Yes | | tmp |
| ENCRYPTED_ARGS | Indicates whether the sensitive environment variables (JWT_SECRET, ARGON_SECRET, DATABASE_URL, READONLY_DATABASE_URL, REDIS_URL, SMTP_PASSWORD, SASL_PASSWORD, ADMIN_PASSWORD, DB_PKCS12_PASSWORD) are stored encrypted | No | Y, N | N |
| ADMIN_EMAIL | On first-time installation, the platform creates an admin user with this email address | No | | |
| ADMIN_PASSWORD | On first-time installation, the platform creates the admin user with this password | No | | |
| JS_DIR | Directory for the JS libraries | Yes | | js |
| THUMBNAILS_DIR | Directory for storing thumbnails | Yes | | thumbnails |
| MULTI_TENANT | To enable a multi-tenant setup | Yes | | N |
| X_FRAME_OPTION | Set a value to add an X-Frame-Options header to server responses | | DENY, SAMEORIGIN, _ | DENY |
| MFA | Set to EMAIL to enable MFA for sign-in (a code is emailed during sign-in) | Yes | EMAIL, OFF | OFF |
| MD_DB_TYPE | Metadata database type | Yes | mysql, postgres, oracle | mysql |
| LIVE_INTERVAL_SECONDS | Interval in seconds at which live updates are sent to clients | Yes | | 15 |
| SQL_TIMEOUT_SECONDS | Timeout in seconds for SQL queries | Yes | | 120 |
| MAX_CONCURRENT_REQUESTS | Maximum number of concurrent client requests allowed on the server | Yes | | 40 |
| SCHEDULER_SLEEP_SECONDS | Sleep time in seconds between scheduler runs | Yes | | 60 |
| PUBLIC_TINY_URL | Set to true to allow public users to create a shared URL | Yes | | |
| AGENT | To enable the Agent setup | No | | |
| DATABASE_URL | Database URL for the metadata connection | Yes | | |
| READONLY_DATABASE_URL | Used for running ad hoc queries from SQL Worksheet | Yes | | Same as DATABASE_URL, with a read-only database user |
| ROOT_DATABASE_URL | Same as DATABASE_URL, with a root database user | No | | |
| DB_ROOT_CERT_PATH | CA cert path | No | | |
| DB_PKCS12_PATH | Private key in PKCS12 format | No | | |
| DB_PKCS12_PASSWORD | Private key password, if any | No | | |
| DB_ACCEPT_INVALID_CERTS | Set to true to accept invalid (e.g., self-signed) certs | No | true, false | |
| DB_SKIP_DOMAIN_VALIDATION | Set to true to skip domain validation | No | true, false | |
| ALLOW_SQL_WORKSHEET_UPDATES | Whether to allow ad hoc updates via SQL Worksheet. Set to N on Production and UAT instances. | Yes | Y, N | N |
| AZURE_CLIENT_SECRET | Azure Key Vault account client secret | No | | |
| AZURE_CLIENT_ID | Azure Key Vault account client ID | No | | |
| AZURE_TENANT_ID | Azure Key Vault account tenant ID | No | | |
| AZURE_KEY_VAULT_URL | Azure Key Vault URL | No | | |
| AZURE_STORAGE_ACCOUNT | Azure storage account | No | | |
| AZURE_STORAGE_MASTER_KEY | Azure storage account master key | No | | |
| REDIS_PREFIX | Prefix for keys stored in Redis | No | | dev |
| REDIS_URL | URL of the Redis server | No | | |
| KAFKA_PREFIX | Prefix applied to topic names before they are created in Kafka | No | | |
| BOOTSTRAP_SERVERS | Kafka bootstrap server URL. If using a cloud instance from Confluent, also provide appropriate values for SECURITY_PROTOCOL, SASL_MECHANISMS, SASL_USERNAME, and SASL_PASSWORD, as provided by Confluent Cloud when creating a new Kafka cluster (see the example below). | No | | |
For example:

```bash
# Local Kafka
BOOTSTRAP_SERVERS=localhost:9092

# Kafka on Confluent Cloud
BOOTSTRAP_SERVERS=p...5.us-west-2.aws.confluent.cloud:9092
SECURITY_PROTOCOL=SASL_SSL
SASL_MECHANISMS=PLAIN
SASL_USERNAME=SR4C...OP4DIA
SASL_PASSWORD=j4StZg8Kg7m...B5Kgant9A
```
| Environment Variable | Description | Required | Valid Values | Default / Example |
| --- | --- | --- | --- | --- |
| SECURITY_PROTOCOL | Kafka security protocol (see BOOTSTRAP_SERVERS) | No | | |
| SASL_MECHANISMS | Kafka SASL mechanism (see BOOTSTRAP_SERVERS) | No | | |
| SASL_USERNAME | Kafka SASL username (see BOOTSTRAP_SERVERS) | No | | |
| SASL_PASSWORD | Kafka SASL password (see BOOTSTRAP_SERVERS) | No | | |
| LOG_OUTPUT | Where log output is written | Yes | console, file | file |
| LOG_SQLS | Set to true to log SQL queries and their parameters | Yes | true, false | false |
| LOG_VIEWER_KEY | Key to access the logs without a session | Yes | | viG_D6Zo6mtXDAt_3Z |
| ENABLE_LOG_VIEWER_USING_KEY | Set to true to allow access to the logs without a session, using a unique key | Yes | true, false | true |
| EMAIL_PROVIDER | Email provider to use for outbound email | Yes | GMAIL, SMTP | |
| SMTP_HOST | SMTP host name used for sending email alerts | Yes | | |
| SMTP_PORT | SMTP port number | No | | |
| SMTP_USE_TLS | Set to true to enable SMTPS | No | true, false | |
| SMTP_USERNAME | SMTP username | Yes, if email is enabled | | |
| SMTP_PASSWORD | SMTP password | Yes, if email is enabled | | |
| GMAIL_CREDENTIAL_FILE_PATH | Gmail service account credentials path | Yes, if EMAIL_PROVIDER is GMAIL | | |
| SMTP_FROM | From email address used for outbound emails | Yes | | |
Change directory to where cloudio-platform.zip was extracted and run ./start.sh from the command prompt. The platform will install all necessary database objects and create the necessary Kafka topics at startup.
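For example, if you extracted the archive to /mnt/cloudio:

```bash
cd /mnt/cloudio
./start.sh
```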
When you start the server for the very first time, all the necessary tables are created and populated with initial seed data. The platform also creates the initial admin user with full privileges. You must set the following environment variables (only for the first-time startup):
| Environment Variable | Description |
| --- | --- |
| ADMIN_EMAIL | Admin user's email address. This must be a valid email address; otherwise, you cannot reset/change the password. |
| ADMIN_PASSWORD | Password to be used for the newly created admin user |
AES-256 with an IV is used for encryption/decryption. You must set the environment variable SECRET to a highly secure key. Once set, you must not change the value, as it may be used to encrypt your application data. A CLI option to change the SECRET will be provided, which will automate re-encrypting the data with the new SECRET.
The following environment variables must be encrypted before starting the server. You can use the encrypt sub-command (see the example below) to encrypt all the required values.

Sample command to encrypt the REDIS_URL environment value:
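```bash
# hypothetical invocation: the binary name and argument form are assumptions;
# use the executable shipped with your distribution and its CLI help for exact syntax
./cloudio encrypt REDIS_URL
```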
Environment variables to be encrypted:

- JWT_SECRET
- ARGON_SECRET
- DATABASE_URL
- READONLY_DATABASE_URL
- REDIS_URL
- SMTP_PASSWORD
- SASL_PASSWORD
- ADMIN_PASSWORD
- DB_PKCS12_PASSWORD
If the server has to serve more than a million requests per hour, you must set up a scalable cluster for Kafka and Redis. The database must be scaled up according to usage, and multiple platform instances must run in parallel to support the load.
Make sure to set up regular backups for MySQL.
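For example, a nightly logical backup with mysqldump (the schedule, paths, and database name cloudio are illustrative):

```bash
# crontab entry: dump the metadata database at 02:00 daily (% must be escaped in cron)
0 2 * * * mysqldump --single-transaction --routines cloudio | gzip > /backups/cloudio-$(date +\%F).sql.gz
```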
For simple deployments, you can disable Kafka, Blob Storage, and Redis.
Use Cases for Single-Node Deployment

- Development instances
- Trial instances
- Production instances with fewer than 3,000 users, when scaling/high availability is not necessary