Cloud Storage¶
Shaka Streamer can output to an HTTP/HTTPS server or to cloud storage.
HTTP or HTTPS URLs will be passed directly to Shaka Packager, which will make PUT requests to the HTTP/HTTPS server to write output files. The URL you pass will be a base for the URLs Packager writes to. For example, if you pass https://localhost:8080/foo/bar/, Packager would make a PUT request to https://localhost:8080/foo/bar/dash.mpd to write the manifest (with default settings).
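The base-URL joining rule can be illustrated with Python's standard library (a sketch of the behavior described above, not Packager's actual code):

```python
from urllib.parse import urljoin

# Packager treats the output URL as a base and appends file names to it.
base = "https://localhost:8080/foo/bar/"  # note the trailing slash
manifest_url = urljoin(base, "dash.mpd")
print(manifest_url)  # https://localhost:8080/foo/bar/dash.mpd

# Without the trailing slash, the last path segment would be replaced instead:
print(urljoin("https://localhost:8080/foo/bar", "dash.mpd"))
# https://localhost:8080/foo/dash.mpd
```

This is why the trailing slash on the output URL matters: it marks the URL as a directory-like base rather than a file.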
Cloud storage URLs can be Google Cloud Storage URLs (beginning with gs://), Amazon S3 URLs (beginning with s3://), or Azure Blob Storage URLs (beginning with azure://). Like the HTTP support described above, these are base URLs. If you ask for output to gs://foo/bar/, Streamer will write to gs://foo/bar/dash.mpd (with default settings).
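The URL scheme is what distinguishes the three providers. A minimal sketch of that dispatch, using only the standard library (illustrative only, not Streamer's internal logic):

```python
from urllib.parse import urlparse

def storage_provider(url: str) -> str:
    """Map an output URL to a provider name based on its scheme (illustrative)."""
    providers = {
        "gs": "Google Cloud Storage",
        "s3": "Amazon S3",
        "azure": "Azure Blob Storage",
    }
    scheme = urlparse(url).scheme
    return providers.get(scheme, "HTTP/HTTPS or unknown")

print(storage_provider("gs://foo/bar/"))     # Google Cloud Storage
print(storage_provider("s3://foo/bar/"))     # Amazon S3
print(storage_provider("azure://acct.blob.core.windows.net/c/"))  # Azure Blob Storage
```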
Cloud storage output uses the storage provider’s Python libraries. Find more details on setup and authentication below.
Google Cloud Storage Setup¶
Install the Python module if you haven’t yet:
python3 -m pip install google-cloud-storage
To use the default authentication, you will need default application
credentials installed. On Linux, these live in
~/.config/gcloud/application_default_credentials.json.
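A quick way to check whether default credentials are already installed is to look for that file (a sketch; the path is the Linux default given above):

```python
import os

# Default location of application default credentials on Linux.
adc_path = os.path.expanduser(
    "~/.config/gcloud/application_default_credentials.json")

if os.path.exists(adc_path):
    print("Default credentials found at", adc_path)
else:
    print("No default credentials; run 'gcloud auth application-default login'")
```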
The easiest way to install default credentials is through the Google Cloud SDK. See https://cloud.google.com/sdk/docs/install-sdk to install the SDK. Then run:
gcloud init
gcloud auth application-default login
Follow the instructions given to you by gcloud to initialize the environment and log in.
Example command-line for live streaming to Google Cloud Storage:
python3 shaka-streamer \
-i config_files/input_looped_file_config.yaml \
-p config_files/pipeline_live_config.yaml \
-o gs://my_gcs_bucket/folder/
Amazon S3 Setup¶
Install the Python module if you haven’t yet:
python3 -m pip install boto3
To authenticate to Amazon S3, you can either add credentials to your boto config file or log in interactively using the AWS CLI:
aws configure
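aws configure writes a standard INI-style credentials file (typically ~/.aws/credentials) that boto3 reads automatically. A sketch of that format, parsed with the standard library (the key values below are placeholders, not real credentials):

```python
import configparser
import io

# Shape of the credentials file 'aws configure' produces (placeholder values).
credentials_ini = """\
[default]
aws_access_key_id = EXAMPLE_KEY_ID
aws_secret_access_key = EXAMPLE_SECRET
"""

config = configparser.ConfigParser()
config.read_file(io.StringIO(credentials_ini))
print(config["default"]["aws_access_key_id"])  # EXAMPLE_KEY_ID
```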
Example command-line for live streaming to Amazon S3:
python3 shaka-streamer \
-i config_files/input_looped_file_config.yaml \
-p config_files/pipeline_live_config.yaml \
-o s3://my_s3_bucket/folder/
Azure Blob Storage Setup¶
Install the Python modules if you haven’t yet:
python3 -m pip install azure-storage-blob azure-identity
Azure Blob Storage support uses append blobs for efficient streaming uploads, making it ideal for live streaming scenarios where data is written sequentially. Authentication is handled by Azure’s DefaultAzureCredential, which automatically tries multiple authentication methods in order.
The most common authentication methods are:
- Azure CLI: log in using az login (recommended for development)
- Managed Identity: automatic when running on Azure resources
- Service Principal: set the AZURE_CLIENT_ID, AZURE_CLIENT_SECRET, and AZURE_TENANT_ID environment variables
- Interactive Browser: fallback authentication method
The Azure URL format is: azure://storageaccount.blob.core.windows.net/container/path/
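A sketch of how the pieces of an azure:// URL break down, using the standard library (illustrative; not Streamer's actual parser):

```python
from urllib.parse import urlparse

url = "azure://mystorageaccount.blob.core.windows.net/mycontainer/folder/"
parsed = urlparse(url)

# The storage account is the first label of the host name.
account = parsed.netloc.split(".")[0]
# The first path segment is the container; the rest is the blob prefix.
container, _, blob_prefix = parsed.path.lstrip("/").partition("/")

print(account)      # mystorageaccount
print(container)    # mycontainer
print(blob_prefix)  # folder/
```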
Example command-line for live streaming to Azure Blob Storage:
python3 shaka-streamer \
-i config_files/input_looped_file_config.yaml \
-p config_files/pipeline_live_config.yaml \
-o azure://mystorageaccount.blob.core.windows.net/mycontainer/folder/