3LC Object Service Deployment Guide#

The Object Service should be run “close” to your data. Often, that means running the Object Service on your local machine, where your ML notebooks and training are running. However, there are scenarios where it is preferable to run the Object Service on a remote machine, such as a compute server, or when using hosted notebook solutions such as Google Colab or AWS SageMaker.

In such scenarios, the Object Service runs on a different computer than the one where the end user accesses the Dashboard in a browser, and the Dashboard needs to know the address of the Object Service. In this guide, we call the machine running the Object Service the compute server, and the computer running your browser localhost. The following flowchart should help you identify your scenario.

../../_images/object-service-deployment-decision-tree.png

Decision tree to determine deployment scenario.#

Scenario A: Local deployment#

This scenario applies when your training runs on your own computer, which is typical for an individual data scientist working on their own.

In this scenario, the following steps should be sufficient:

  1. Start the Object Service using 3lc service on the command line.

  2. Open your browser and go to dashboard.3lc.ai.

Scenario B: SSH Tunnel#

This scenario applies when the training happens on a different computer than where you are accessing your browser. For example, if your department has a compute server, or if you are using virtual machine instances in the cloud.

If you have SSH access to the compute server, it is possible to use port forwarding so that the Dashboard connects to localhost and is unaware that the Object Service is running on the compute server.

Using OpenSSH, the following command forwards port 5015 on the compute server to port 5015 on localhost. This should allow the default Dashboard at dashboard.3lc.ai to connect directly.

ssh -L 5015:<compute server ip>:5015 user@<compute server> 

The command may need to be adjusted if you are using another SSH implementation.

After the SSH tunnel has been established, simply go to dashboard.3lc.ai to access the Object Service.
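Before opening the Dashboard, it can be useful to confirm that the tunnel is actually forwarding traffic. A minimal sketch using a quick TCP check follows; the helper name is our own, and 5015 is the default Object Service port mentioned above:

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# After `ssh -L 5015:...` is running, the forwarded port can be checked with:
# is_port_open("localhost", 5015)
```

If the check returns False, verify that the Object Service is running on the compute server and that the tunnel command is still active.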

Scenario C: Hosted Reverse Proxy#

In scenarios where the Object Service runs on a computer that is only reachable over the public internet, we provide integration with the NGrok reverse proxy service. With this integration, the Object Service is assigned a temporary public URL for the Dashboard to use. Although this URL is public, 3LC has built-in authentication to ensure that the Dashboard and Object Service are logged into the same account.

Examples of services where this is useful include hosted Jupyter Notebook environments such as Google Colab and AWS SageMaker. Please ensure that you are allowed to use reverse proxies with such hosted notebook services prior to using the Object Service with NGrok.

To use the NGrok integration, the following steps are required:

Install package#

Ensure that the 3lc[pyngrok] package is installed on the computer where you want to run the Object Service. Note that some shells (such as zsh) require the package specifier to be quoted, e.g. pip install "3lc[pyngrok]":

pip install 3lc[pyngrok]

Sign up for NGrok account#

Sign up for an NGrok account at ngrok.com. NGrok offers both free and paid accounts, depending on your usage needs.

Important

NGrok is a third-party service that requires a separate sign-up and has its own terms and conditions. Please review these carefully before using NGrok.

Copy NGrok Authtoken#

Log in to your NGrok account and copy the Auth Token from the NGrok Dashboard.

The NGrok Dashboard where the Authtoken can be accessed.

Set NGROK_TOKEN environment variable#

On the computer where you will launch the Object Service, set the NGROK_TOKEN environment variable. This must be set prior to launching the Object Service.

Linux/macOS:

export NGROK_TOKEN=<ngrok token>

Windows (cmd):

set NGROK_TOKEN=<ngrok token>
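When scripting the launch, it can help to fail fast with a clear message if the variable is missing. A small sketch follows; the helper function is our own and not part of the 3lc package:

```python
import os

def require_ngrok_token(env=os.environ):
    """Raise a clear error if NGROK_TOKEN is unset or empty; otherwise return it."""
    token = env.get("NGROK_TOKEN", "").strip()
    if not token:
        raise RuntimeError(
            "NGROK_TOKEN is not set; set it before running `3lc service --ngrok`"
        )
    return token
```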

Launch Object Service#

Launch the Object Service using the --ngrok option.

3lc service --ngrok

The Object Service will now set up an NGrok endpoint and output a URL that can be used to open a Dashboard that connects through that endpoint:

3LC Object Service - Version: 2.3, PID: 1408 (Use Ctrl-C to exit.)
Platform: Windows 10.0.22631 (AMD64), Python: 3.11.7
Object Service URL: http://127.0.0.1:5015
Dashboard URL: https://dashboard.3lc.ai?object_service=https%3A%2F%2Fdb8c-62-92-224-122.ngrok-free.app

Scenario D: SSL Configuration#

Advanced

This scenario can be complex, and may require the cooperation and expertise of your system administrator to set up correctly.

The 3LC Dashboard is served over HTTPS. Modern browsers do not allow access to HTTP endpoints from pages served over HTTPS. Therefore, it is necessary to configure the Object Service to be accessible over HTTPS as well.

However, the Object Service application itself cannot host HTTPS services. Thus, a separate service is required to provide an SSL connection and forward the traffic to the Object Service. This is typically achieved with a reverse proxy or a load balancer, so that the 3LC Dashboard accesses the Object Service via HTTPS while the Object Service itself continues to operate over HTTP.

The method of SSL offloading or termination can vary depending on the deployment scenario and requirements. Below are a few common approaches:

  1. Deploy on a Virtual Machine:

    Most cloud providers offer load balancing, certificate integration, and DNS services. Refer to your cloud provider’s documentation for details on setting up SSL termination for a virtual machine instance. Example: On AWS, a Load Balancer can be configured to use a certificate from the AWS Certificate Manager to provide SSL termination, with the EC2 instance as a target.

  2. Deploy on a Kubernetes Cluster:

    Kubernetes supports ingress controllers that handle SSL termination, which can utilize certificates from Let’s Encrypt, self-signed certificates, or a cloud provider’s certificate solution. Example: The ALB Ingress Controller offers integration with AWS Certificate Manager.

  3. Deploy on On-Premise Infrastructure:

    Many enterprises already have a load balancer with SSL termination capabilities. Consult your specific vendor’s documentation for SSL configuration and forwarding instructions. Example: Nginx can be installed on the same machine that hosts the Object Service to manage SSL Termination and forward traffic. Refer to the Nginx documentation for setup guidance.
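For the on-premise Nginx approach, a minimal server block might look like the following. This is a sketch only: the hostname and certificate paths are placeholders, and your actual configuration should follow the Nginx documentation and your organization's certificate setup.

```nginx
server {
    listen 443 ssl;
    server_name objectservice.example.com;  # placeholder hostname

    ssl_certificate     /etc/ssl/certs/objectservice.crt;   # placeholder path
    ssl_certificate_key /etc/ssl/private/objectservice.key; # placeholder path

    location / {
        # Terminate SSL here and forward plain HTTP to the local Object Service
        proxy_pass http://127.0.0.1:5015;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```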

Dashboard Configuration#

To ensure seamless integration between your Dashboard app and the Object Service, it is crucial to configure the network settings properly. By default, the Dashboard app expects the Object Service to be accessible at http://localhost:5015. However, when running the Object Service on a remote server, you will need to provide the Dashboard with the correct address of the Object Service.

Warning

Modern browsers do not allow access to HTTP endpoints from HTTPS pages. This is called accessing mixed content. When the Object Service is on a compute server and accessed via a browser from a remote location, the browser must be configured to allow mixed content, or SSL must be terminated in front of the Object Service through configuration on the compute server (see Scenario D).

Settings Dialog#

The URL of the Object Service can be set through the Dashboard’s settings dialog by changing the Object Service URL under Connection.

The settings dialog in the dashboard. It can be used to specify the URL of the object service.

Query Parameter#

As an alternative to using the settings dialog, it is possible to set the URL of the Object Service as a query parameter to the Dashboard. This approach is best suited for scripting or applications that automate the launching of the Object Service.

The format of the query parameter is:

https://dashboard.3lc.ai?object_service=<url>

Note that the value of the query parameter must be URL-encoded. This applies to characters that typically appear in URLs, such as / and :. An example of an encoded URL is

https://dashboard.3lc.ai?object_service=https%3A%2F%2F192.168.0.10%3A5015
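The encoded value can be produced with Python's standard library, which is convenient when scripting the launch of the Object Service. The helper function name below is our own:

```python
from urllib.parse import quote

def dashboard_url(object_service_url: str) -> str:
    # safe="" ensures that "/" and ":" are percent-encoded as well
    return "https://dashboard.3lc.ai?object_service=" + quote(object_service_url, safe="")

print(dashboard_url("https://192.168.0.10:5015"))
# → https://dashboard.3lc.ai?object_service=https%3A%2F%2F192.168.0.10%3A5015
```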