Docker is the easiest way to get started with the self-hosted version of Lago.
You can start the app with a one-click Docker command in a shell:
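The exact command can change between releases; as an illustrative sketch, assuming the `getlago/lago` image and the default ports used throughout this page:

```shell
# Sketch only: the image name and port mappings are assumptions,
# check Lago's README or Docker Hub for the current one-click command.
docker run -d --name lago \
  -p 80:80 \
  -p 3000:3000 \
  getlago/lago:latest
```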
You can now open your browser and go to http://localhost to connect to the application. Just after signing up, Lago’s API is exposed at http://localhost:3000.
If you don’t want to use the one-click Docker command, you can start using Lago by running more advanced commands in a shell:
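As a sketch of the manual route, assuming the standard Lago repository layout and Docker Compose v2:

```shell
# Sketch: clone the repository and start every service defined in docker-compose.yml.
git clone https://github.com/getlago/lago.git
cd lago

# Generate the RSA private key used for webhook signatures
# (see the environment variables section below).
echo "LAGO_RSA_PRIVATE_KEY=\"$(openssl genrsa 2048 | base64 | tr -d '\n')\"" >> .env

# Start the stack in the background.
docker compose up -d
```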
You can now open your browser and go to http://localhost to connect to the application. Just after signing up, Lago’s API is exposed at http://localhost:3000.
It’s mandatory to create your organization by signing up to Lago. This organization is the core object of your billing system, as it’s used to invoice your customers.
To sign up, you need to provide:

- the organization name;
- the email of your company; and
- a password for this email.

You will be able to invite other email addresses within the application. If you already have an account, you can also log in. Once you are able to access the app, you can retrieve your API key.
Your API Key can be found directly in the UI:
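Once retrieved, the key authenticates requests against the API exposed at http://localhost:3000. As a sketch of the call shape (the `/api/v1/customers` endpoint and the Bearer scheme are assumptions for illustration):

```shell
# Hypothetical example call: replace $API_KEY with the key shown in the UI.
curl --request GET "http://localhost:3000/api/v1/customers" \
  --header "Authorization: Bearer $API_KEY"
```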
Docker images are always updated to the latest stable version in the `docker-compose.yml` file. You can use a different tag if needed by checking the releases list.

We recommend avoiding the `latest` tag; you should use the most recent tagged version instead. You can track the latest versions on Docker Hub.
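For example, to pin a specific release instead of `latest` (the version number below is a placeholder, and the `getlago/api` and `getlago/front` image names are assumptions; check `docker-compose.yml` for the exact services):

```shell
# Pull explicitly tagged images rather than relying on the "latest" tag.
# v1.0.0 is a placeholder: pick a real tag from the releases list or Docker Hub.
docker pull getlago/api:v1.0.0
docker pull getlago/front:v1.0.0
```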
Lago uses the following environment variables to configure the components of the application. You can override them to customise your setup.
Variable | Default value | Description |
---|---|---|
POSTGRES_HOST | db | (With Docker compose) Host name of the postgres server |
POSTGRES_DB | lago | (With Docker compose) Name of the postgres database |
POSTGRES_USER | lago | (With Docker compose) Database user for postgres connection |
POSTGRES_PASSWORD | changeme | (With Docker compose) Database password for postgres connection |
POSTGRES_PORT | 5432 | (With Docker compose) Port the postgres database listens to |
POSTGRES_SCHEMA | public | Name of the postgres schema |
DATABASE_URL | | (Without Docker compose) Full URL to the postgres server |
DATABASE_POOL | 10 | Max number of connection opened to the postgres database per api, worker and clock instances |
DATABASE_PREPARED_STATEMENTS | true | Enable or disable prepared statements in the postgres database |
REDIS_HOST | redis | Host name of the redis server |
REDIS_PORT | 6379 | Port the redis database listens to |
REDIS_PASSWORD | | Password of the redis server |
LAGO_REDIS_CACHE_HOST | redis | Host name of the redis cache server |
LAGO_REDIS_CACHE_PORT | 6379 | Port the redis cache server listens to |
LAGO_REDIS_CACHE_PASSWORD | | Password of the redis cache server |
LAGO_REDIS_CACHE_POOL_SIZE | 5 | Max number of connections in the redis cache connection pool |
LAGO_MEMCACHE_SERVERS | | Comma-separated list of memcache servers |
LAGO_FRONT_URL | http://localhost | URL of the Lago front-end application. Used for CORS configuration |
FRONT_PORT | 80 | Port the front-end application listens to |
LAGO_API_URL | http://localhost:3000 | URL of the Lago back-end application |
API_URL | http://localhost:3000 | URL of the Lago back-end application defined for the front image |
API_PORT | 3000 | Port the back-end application listens to |
SECRET_KEY_BASE | your-secret-key-base-hex-64 | Secret key used for session encryption |
SENTRY_DSN | | Sentry DSN key for error and performance tracking on Lago back-end |
SENTRY_DSN_FRONT | | Sentry DSN key for error and performance tracking on Lago front-end |
LAGO_RSA_PRIVATE_KEY | | Private key used for webhook signatures |
LAGO_SIDEKIQ_WEB | | Activate the Sidekiq web UI, disabled by default |
LAGO_ENCRYPTION_PRIMARY_KEY | your-encryption-primary-key | Encryption primary key used to secure sensitive values stored in the database |
LAGO_ENCRYPTION_DETERMINISTIC_KEY | your-encryption-deterministic-key | Encryption deterministic key used to secure sensitive values stored in the database |
LAGO_ENCRYPTION_KEY_DERIVATION_SALT | your-encryption-derivation-salt | Encryption key salt used to secure sensitive values stored in the database |
LAGO_WEBHOOK_ATTEMPTS | 3 | Number of failed attempts before webhook delivery is stopped |
LAGO_USE_AWS_S3 | false | Use AWS S3 for files storage |
LAGO_AWS_S3_ACCESS_KEY_ID | azerty123456 | AWS Access Key id that has access to S3 |
LAGO_AWS_S3_SECRET_ACCESS_KEY | azerty123456 | AWS Secret Access Key that has access to S3 |
LAGO_AWS_S3_REGION | us-east-1 | AWS S3 Region |
LAGO_AWS_S3_BUCKET | bucket | AWS S3 Bucket name |
LAGO_AWS_S3_ENDPOINT | | S3-compatible storage endpoint. Should be set only if you are using a storage provider other than AWS S3 |
LAGO_USE_GCS | false | Use Google Cloud Storage (GCS) for file storage. ⚠️ If you want to use GCS, you have to pass the credentials JSON key file to the api and worker services |
LAGO_GCS_PROJECT | | GCS Project name |
LAGO_GCS_BUCKET | | GCS Bucket name |
LAGO_PDF_URL | http://pdf:3000 | PDF Service URL on your infrastructure |
LAGO_DISABLE_SIGNUP | | Disable sign-up when running Lago in self-hosted mode |
LAGO_RAILS_STDOUT | true | Set to true to activate logs on containers |
LAGO_DISABLE_WALLET_REFRESH | | Disable automatic refresh of wallet ongoing balance |
LAGO_DISABLE_PDF_GENERATION | false | Disable automatic PDF generation for invoices, credit notes, and receipts. As a result, the corresponding download endpoints will be unavailable |
GOOGLE_AUTH_CLIENT_ID | | Client ID for Google auth Single Sign On |
GOOGLE_AUTH_CLIENT_SECRET | | Client Secret for Google auth Single Sign On |
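The security-sensitive values in this table can be generated up front; a sketch assuming a POSIX shell with `openssl` available, using the generation commands recommended below:

```shell
# Generate the recommended secrets; writing them to a .env file is an assumption,
# adapt this to however you pass environment variables to the containers.
SECRET_KEY_BASE=$(openssl rand -hex 64)
LAGO_RSA_PRIVATE_KEY=$(openssl genrsa 2048 | base64 | tr -d '\n')
LAGO_ENCRYPTION_PRIMARY_KEY=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 32 | head -n 1)
LAGO_ENCRYPTION_DETERMINISTIC_KEY=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 32 | head -n 1)
LAGO_ENCRYPTION_KEY_DERIVATION_SALT=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 32 | head -n 1)

printf 'SECRET_KEY_BASE=%s\n' "$SECRET_KEY_BASE" >> .env
printf 'LAGO_RSA_PRIVATE_KEY="%s"\n' "$LAGO_RSA_PRIVATE_KEY" >> .env
printf 'LAGO_ENCRYPTION_PRIMARY_KEY=%s\n' "$LAGO_ENCRYPTION_PRIMARY_KEY" >> .env
printf 'LAGO_ENCRYPTION_DETERMINISTIC_KEY=%s\n' "$LAGO_ENCRYPTION_DETERMINISTIC_KEY" >> .env
printf 'LAGO_ENCRYPTION_KEY_DERIVATION_SALT=%s\n' "$LAGO_ENCRYPTION_KEY_DERIVATION_SALT" >> .env
```

Remember to also change `POSTGRES_PASSWORD`, which has no generation command and can be any strong password.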
We recommend that you change `POSTGRES_PASSWORD`, `SECRET_KEY_BASE`, `LAGO_RSA_PRIVATE_KEY`, `LAGO_ENCRYPTION_PRIMARY_KEY`, `LAGO_ENCRYPTION_DETERMINISTIC_KEY` and `LAGO_ENCRYPTION_KEY_DERIVATION_SALT` to improve the security of your Lago instance:

- `SECRET_KEY_BASE` can be generated using the `openssl rand -hex 64` command.
- `LAGO_RSA_PRIVATE_KEY` can be generated using the `openssl genrsa 2048 | base64` command.
- `LAGO_ENCRYPTION_PRIMARY_KEY`, `LAGO_ENCRYPTION_DETERMINISTIC_KEY` and `LAGO_ENCRYPTION_KEY_DERIVATION_SALT` can all be generated using the `cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 32 | head -n 1` command.

Lago uses the following containers:
Container | Role |
---|---|
front | Front-end application |
api | API back-end application |
api_worker | Asynchronous worker for the API application |
api_clock | Clock worker for the API application |
db | Postgres database engine used to store application data |
redis | Redis database engine used as a queuing system for asynchronous tasks |
pdf | PDF generation powered by Gotenberg |
You can also use your own database or Redis server. To do so, remove the `db` and `redis` configurations from the `docker-compose.yml` file and update the environment variables accordingly.
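For instance, pointing Lago at an external Postgres and Redis might look like this (hostnames and credentials are placeholders):

```shell
# Environment overrides for external services; every value below is a placeholder.
DATABASE_URL=postgresql://lago:changeme@my-postgres.internal:5432/lago
REDIS_HOST=my-redis.internal
REDIS_PORT=6379
REDIS_PASSWORD=changeme
```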
Lago Front application can be configured to support SSL certificates. You have two options to achieve this:

Using a self-signed certificate:

1. Open the `docker-compose.yml` file and uncomment the part related to the self-signed certificate

Using Let’s Encrypt:

1. Edit `extra/init-letsencrypt.sh` and replace `lago.example` with your domain name
2. Edit `extra/nginx-letsencrypt.conf` and replace `lago.example` with your domain name
3. Open the `docker-compose.yml` file and uncomment all the parts related to Let’s Encrypt’s support

By default, Lago uses the internal storage of the container. You can customize it by defining different environment variables.

We currently support:

- AWS S3
- AWS S3-compatible endpoints
- Google Cloud Storage
If you use S3-compatible endpoints, you should set `LAGO_AWS_S3_REGION` to a default value (e.g. `us-east-1`); it is required for the integration to work properly.
You have to set these variables to use AWS S3.
Name | Description |
---|---|
LAGO_USE_AWS_S3 | Set to “true” if you want to use AWS S3 |
LAGO_AWS_S3_ACCESS_KEY_ID | AWS S3 Credentials Access Key Id |
LAGO_AWS_S3_SECRET_ACCESS_KEY | AWS S3 Credentials Secret Access Key |
LAGO_AWS_S3_REGION | AWS S3 Region |
LAGO_AWS_S3_BUCKET | AWS S3 Bucket |
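For example (all values are placeholders; the access keys shown are AWS’s documented sample credentials):

```shell
# Sketch of an AWS S3 storage configuration; replace every value with your own.
LAGO_USE_AWS_S3=true
LAGO_AWS_S3_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
LAGO_AWS_S3_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
LAGO_AWS_S3_REGION=us-east-1
LAGO_AWS_S3_BUCKET=my-lago-bucket
```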
You have to set these variables to use AWS S3 Compatible Endpoints.
Name | Description |
---|---|
LAGO_USE_AWS_S3 | Set to “true” if you want to use AWS S3 Compatible Endpoints |
LAGO_AWS_S3_ENDPOINT | AWS S3 Compatible Endpoint |
LAGO_AWS_S3_ACCESS_KEY_ID | AWS S3 Credentials Access Key Id |
LAGO_AWS_S3_SECRET_ACCESS_KEY | AWS S3 Credentials Secret Access Key |
LAGO_AWS_S3_BUCKET | AWS S3 Bucket |
LAGO_AWS_S3_REGION | Not used but required by the AWS SDK |
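For example, with a hypothetical S3-compatible provider (endpoint and credentials are placeholders; note the dummy region, which the AWS SDK still requires):

```shell
# Sketch of an S3-compatible storage configuration; replace every value with your own.
LAGO_USE_AWS_S3=true
LAGO_AWS_S3_ENDPOINT=https://storage.example.com
LAGO_AWS_S3_ACCESS_KEY_ID=my-access-key
LAGO_AWS_S3_SECRET_ACCESS_KEY=my-secret-key
LAGO_AWS_S3_BUCKET=my-lago-bucket
LAGO_AWS_S3_REGION=us-east-1
```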
You have to set these variables to use Google Cloud Storage.
Name | Description |
---|---|
LAGO_USE_GCS | Set to “true” if you want to use GCS Cloud Storage |
LAGO_GCS_PROJECT | GCS Project name |
LAGO_GCS_BUCKET | GCS Bucket name |
In the `docker-compose.yml` file, you must uncomment the relevant lines and pass the correct GCS credentials JSON file.
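A sketch of the resulting configuration (project and bucket names are placeholders; remember that the credentials JSON key file must also be made available to the api and worker services as noted above):

```shell
# Sketch of a Google Cloud Storage configuration; replace the values with your own.
LAGO_USE_GCS=true
LAGO_GCS_PROJECT=my-gcp-project
LAGO_GCS_BUCKET=my-lago-bucket
```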
In order to use the email feature, you need to configure some environment variables.
In addition to this configuration, defining an organization email in Settings > Organization is mandatory; without it, emails will not be sent.
Name | Description |
---|---|
LAGO_FROM_EMAIL | Required to send emails (e.g. noreply@getlago.com) |
LAGO_SMTP_ADDRESS | Address of the SMTP server |
LAGO_SMTP_PORT | Port of the SMTP Server |
LAGO_SMTP_USERNAME | Username of the SMTP Server |
LAGO_SMTP_PASSWORD | Password of the SMTP Server |
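For example (SMTP server values are placeholders):

```shell
# Sketch of an email configuration; replace the placeholders with your SMTP details.
LAGO_FROM_EMAIL=noreply@getlago.com
LAGO_SMTP_ADDRESS=smtp.example.com
LAGO_SMTP_PORT=587
LAGO_SMTP_USERNAME=lago
LAGO_SMTP_PASSWORD=changeme
```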
In order to enable Google authentication for Single Sign-On, you have to set these variables.
Name | Description |
---|---|
GOOGLE_AUTH_CLIENT_ID | Client ID for Google auth Single Sign On |
GOOGLE_AUTH_CLIENT_SECRET | Client Secret for Google auth Single Sign On |
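For example (both values are placeholders, obtained from an OAuth client in the Google Cloud console):

```shell
# Sketch of a Google SSO configuration.
GOOGLE_AUTH_CLIENT_ID=1234567890-abcdefg.apps.googleusercontent.com
GOOGLE_AUTH_CLIENT_SECRET=changeme
```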