Struggling with recording in OpenVidu 3.1.0

Hi all,
we are trying to start the call recording process using the LiveKit PHP SDK.
Our backend is built in PHP and we installed the library with the following command: composer require agence104/livekit-server-sdk.
This installed the version "agence104/livekit-server-sdk": "^1.2", whose composer.json file is the following:

{
    "name": "agence104/livekit-server-sdk",
    "description": "Server-side SDK for LiveKit.",
    "type": "library",
    "license": "MIT",
    "version": "1.2.5",
    "authors": [
        {
            "name": "Agence 10-4",
            "homepage": "https://www.agence104.com"
        }
    ],
    "require": {
        "php": "^8.0",
        "firebase/php-jwt": "^6.8",
        "twirp/twirp": "^0.9.1",
        "google/protobuf": "^3.23",
        "guzzlehttp/guzzle": "^6.3|^7.0",
        "guzzlehttp/psr7": "^1.6.1|^2.0"
    },
    "autoload": {
        "psr-4": {
            "Agence104\\LiveKit\\": "src",
            "GPBMetadata\\": "src/proto/GPBMetadata",
            "Livekit\\": "src/proto/Livekit"
        }
    },
    "require-dev": {
        "phpunit/phpunit": "^10.4"
    }
}

We need the call recordings to be saved to an AWS S3 bucket with high availability, which has been set up correctly as shown in the following image.

We reviewed the example on your site, but it appears to be available only in Node.js (Basic Recording Tutorial - OpenVidu).

We want to know:

  • Is the library we installed the correct one?
  • Are there any examples or docs we can read to understand how to implement the recording part of the call?

We would greatly appreciate it if you could provide a specific example or a guide to implement this part correctly.

Greetings
Mattia

Hello,

Yes, library agence104/livekit-server-sdk-php is the official (community-contributed) SDK for managing the LiveKit Server APIs.
We don’t currently have tutorials showing the use of the Egress API with the PHP library, but the PHP application server tutorial available in our docs can serve as a very good starting point. Adding egress capabilities to it should be quite straightforward.
What kind of problem are you experiencing when trying to start an egress with an S3 bucket as the destination? Can you provide further details?

Hi Pablo,

We are trying to understand how to implement the API call between the web conference and our endpoint. In particular, we are trying to start recordings by reading the documentation at Recording and composition | LiveKit Docs; however, as I said before, there are no PHP tutorials, so it is a bit difficult to understand how it works.

We have already looked at the PHP tutorial, but it did not explain how to start or stop a recording, so we are stuck. We are trying different ways to start a recording, but none of them seem to work…

<?php

require 'vendor/autoload.php';

use Agence104\LiveKit\EgressServiceClient;
use Agence104\LiveKit\EncodedFileOutput;

$host = '';
$apiKey = '';
$apiSecret = '';

$egressClient = new EgressServiceClient($host, $apiKey, $apiSecret);
$s3Output = new EncodedFileOutput([
    'filepath' => '',
    's3' => [
        'access_key' => '',
        'secret' => '',
        'region' => '',
        'bucket' => '',
    ]
]);

$response = $egressClient->startRoomCompositeEgress('session-name', 'grid', $s3Output);

echo '<pre>'; var_dump($response); exit;

In particular, PHP reports that the class Agence104\LiveKit\EncodedFileOutput is not found.

Greetings,
Mattia

Hello Mattia,

Your code snippet is not importing the PHP classes in the right way. It should look something like this:

use Agence104\LiveKit\EgressServiceClient;
use Agence104\LiveKit\EncodedOutputs;
use Livekit\EncodedFileOutput;
use Livekit\S3Upload;

...

// Use the EgressServiceClient to start a recording
$egressClient = new EgressServiceClient($LIVEKIT_URL, $LIVEKIT_API_KEY, $LIVEKIT_API_SECRET);
// Define an S3Upload
$s3Upload = new S3Upload();
$s3Upload->setEndpoint($_ENV["S3_ENDPOINT"]);
$s3Upload->setAccessKey($_ENV["S3_ACCESS_KEY"]);
$s3Upload->setSecret($_ENV["S3_SECRET_KEY"]);
$s3Upload->setBucket("openvidu");
// Define an EncodedFileOutput
$output = new EncodedFileOutput();
$output->setS3($s3Upload);
// Define the EncodedOutputs
$encodedOutputs = new EncodedOutputs();
$encodedOutputs->setFile($output);
// Perform the request to start the egress
$egressResponse = $egressClient->startRoomCompositeEgress($roomName, "grid", $encodedOutputs);

Notice that some classes are the ones generated automatically from the LiveKit protocol (with protobuf), and they should be imported directly from the Livekit\ namespace, not from Agence104\LiveKit\. IMHO, using an IDE with a good autocomplete system or even an AI copilot can help you overcome these little details very easily.
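In case it is useful later, here is a minimal sketch of how stopping a recording might look with the same client. This assumes the SDK exposes listEgress() and stopEgress() mirroring the other LiveKit server SDKs; check the SDK sources for the exact signatures:

```php
<?php

require 'vendor/autoload.php';

use Agence104\LiveKit\EgressServiceClient;

$egressClient = new EgressServiceClient($LIVEKIT_URL, $LIVEKIT_API_KEY, $LIVEKIT_API_SECRET);

// Assumption: listEgress() accepts a room name and returns a response
// with getItems(), as in the other LiveKit server SDKs.
$listResponse = $egressClient->listEgress($roomName);
foreach ($listResponse->getItems() as $egressInfo) {
    // Stop each active egress using the ID returned when it was started.
    $egressClient->stopEgress($egressInfo->getEgressId());
}
```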

Best regards!

Hi Pablo,
I think I’m in the same situation as Mattia (I also have the same deployment); however, I got this response from the library:

{"error":"Failed to start egress: no response from servers"}

What does it mean? Does this indicate that there is an error in the parameters passed to some method?

Another question: I noticed that for an HA deployment the configuration creates two different buckets:

  • openvidu-appdata-****
  • openvidu-clusterdata-****

In my PHP script I set up the S3 piece of code to connect to the first one; is that correct?

Thank you in advance.
Greetings, Matteo.

Hello Matteo,

The error Failed to start egress: no response from servers usually means that your deployment is not healthy. It may be that your egress container is not running properly on your Media Nodes, or that your Media Nodes do not have enough free CPU to start a new egress. Please check the logs of your Media Node.

Yes, the S3 bucket for recordings is the first one, openvidu-appdata-****. The other bucket is only for shared configuration, metrics, and historical data.

Best regards.

Hi Pablo,
I checked my Media Node containers, but everything seems to be OK, right?

I’m trying to start a recording for a session with one user connected, on a Media Node with 4 CPUs.

The editor doesn’t allow uploading .log or .txt files, so I’m leaving the file with the logs at this link.
I tried to understand what wasn’t working, but I didn’t get it.

It also seems that at a certain point the recording was started, but I didn’t see anything in my bucket.

Greetings, Matteo.

I don’t see anything wrong in the log file of your egress container.
It would be helpful to see the full log of containers openvidu and egress at the same time, to see the exact flow of events and messages just after you try to start the recording.
In case it is a problem of CPU resources, you can try tweaking these configuration properties in the egress.yaml file:

cpu_cost: # optionally override cpu cost estimation, used when accepting or denying requests
  room_composite_cpu_cost: 3.0
  web_cpu_cost: 3.0
  track_composite_cpu_cost: 2.0
  track_cpu_cost: 1.0
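As a rough illustration of how this cost gating plays out (hypothetical arithmetic, not the actual egress scheduler code): with the default room_composite_cpu_cost of 3.0, a 4-CPU Media Node can only run one room composite egress at a time:

```php
<?php

// Hypothetical sketch of the CPU-cost check, using the numbers from this thread.
$nodeCpus = 4.0;          // Media Node with 4 CPUs
$roomCompositeCost = 3.0; // default room_composite_cpu_cost

// With one room composite already running, only 1.0 CPU remains idle,
// so a second request (costing 3.0) would be denied.
$idleCpus = $nodeCpus - $roomCompositeCost;
$secondAccepted = $idleCpus >= $roomCompositeCost;

var_dump($idleCpus);       // float(1)
var_dump($secondAccepted); // bool(false)
```

This is why lowering room_composite_cpu_cost (or using a larger Media Node) can make the "no response from servers" error disappear when it is caused by CPU gating.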

Hi Pablo,
I’m leaving you the screenshot of my egress.yaml file found on the Media Node itself.
I don’t know why the access_key and secret for the S3 storage are empty; I think this could be causing the error. Could it be?

What do you think about the cpu_cost configuration? I haven’t changed anything yet; this is the default configuration set up by the deployment process.

My HA deployment is set up using scalable Media Nodes. If the system needs to create a new one, does it copy the configuration of the first Media Node or not?

I also leave you the link to the egress.txt and openvidu.txt log files, so you can take a look:

Greetings, Matteo.

The file upload configuration can be set up in your egress.yaml file, but it can also be overridden when calling the egress start operation. This is exactly what this example is doing. So as long as your S3 credentials are valid and you set them up either in 1) your egress.yaml file or 2) your start egress API call, this shouldn’t be a problem.
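Related to this, the per-request output can also set the destination path inside the bucket. A small sketch, assuming the generated Livekit\EncodedFileOutput exposes a setFilepath() setter and that LiveKit's {room_name}/{time} filepath templating applies here (verify both against your SDK version):

```php
<?php

require 'vendor/autoload.php';

use Livekit\EncodedFileOutput;
use Livekit\S3Upload;

$s3Upload = new S3Upload();
$s3Upload->setAccessKey($_ENV['S3_ACCESS_KEY']);
$s3Upload->setSecret($_ENV['S3_SECRET_KEY']);
$s3Upload->setBucket($_ENV['S3_BUCKET']);

$output = new EncodedFileOutput();
// Assumption: LiveKit expands {room_name} and {time} when writing the file.
$output->setFilepath('recordings/{room_name}-{time}.mp4');
$output->setS3($s3Upload);
```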

Answering your HA deployment question: yes, new Media Nodes will always start with the same active configuration as the previously existing Media Nodes.

I still don’t see anything wrong in your log files. The egresses seem to be completing just fine. You can see that in both your openvidu logs:

2025-05-22T06:51:11.819Z        INFO    openvidu.webhook        webhook/url_notifier.go:124     sent webhook    {"event": "egress_ended", "id": "EV_8gxDXeVXLgTH", "webhookTime": 1747896671, "egressID": "EG_YXn5m28K45YZ", "status": "EGRESS_COMPLETE", "url": "http://***.***.***.***:6080/livekit/webhook", "queueDuration": "83.483µs", "sendDuration": "8.475753ms"}

and your egress logs:

2025-05-22T06:51:11.811Z        INFO    egress  info/io.go:178  egress_complete {"nodeID": "NE_N3zzUBL89vvF", "clusterID": "", "egressID": "EG_YXn5m28K45YZ", "requestType": "room_composite", "outputType": "file", "error": "", "code": 0, "details": "End reason: Source closed"}

So this must be something wrong with your S3 configuration. To confirm this theory, please check path /opt/openvidu/data/egress_data/home/egress/backup_storage in the Media Node hosting your egress. There you should find the completed egress files if the S3 upload failed for any reason.

You can also do the following to increase the logging level and understand why the S3 upload process is failing:

  1. Increase the debug level of the egress container in the egress.yaml config file at /opt/openvidu/config, adding this:

logging:
    level: debug

And modify the S3 part of the configuration, adding this:

storage:
    s3:
        ... other config...
        aws_log_level: LogDebugWithRequestRetries

  2. Restart your OpenVidu deployment so these changes take effect, and repeat your test.

Also, make sure to carefully read this section of the documentation: Configuring external S3 for OpenVidu recordings - OpenVidu

There you have instructions to properly set up recording storage in an external S3 bucket (a bucket outside of the OpenVidu deployment), if that’s what you are trying to do.

Hi Pablo,
I think I’m making progress with the recording (I can now see the recording inside the bucket); however, I would like to get some other clarifications.
This is my PHP code:

<?php

require 'vendor/autoload.php';

use Agence104\LiveKit\EgressServiceClient;
use Agence104\LiveKit\EncodedOutputs;
use Livekit\EncodedFileOutput;
use Livekit\S3Upload;

$egressClient = new EgressServiceClient($sLiveKitAPIURL, $sLiveKitAPIKey, $sLiveKitAPISecret);

$s3Upload = new S3Upload();
$s3Upload->setEndpoint('https://s3.eu-south-1.amazonaws.com/');
$s3Upload->setAccessKey('***');
$s3Upload->setSecret('***');
$s3Upload->setBucket('openvidu-appdata-***');

$output = new EncodedFileOutput();
$output->setS3($s3Upload);

$encodedOutputs = new EncodedOutputs();
$encodedOutputs->setFile($output);

try {
    $egressResponse = $egressClient->startRoomCompositeEgress($aData['eventCode'], 'grid', $encodedOutputs);

    echo json_encode(['egress' => $egressResponse->getEgressId(), 'error' => '']);
    exit;
} catch (Exception $e) {
    echo json_encode(['egress' => '', 'error' => 'Failed to start egress: ' . $e->getMessage()]);
    exit;
}

?>

I would like to clarify whether I only need to retrieve the egressId, or if I also need to return additional values. I analyzed the request made by your OVCall demo, and the response seems quite different from mine.

Specifically, I’d like to understand if, after obtaining the egressId, I need to manually indicate that the recording has started. It seems like something might be missing, because the button on the panel doesn’t switch from “Start recording” to “Stop recording,” and the red container below the session name doesn’t appear.

I recall that in version 2.29, the red container would automatically appear once the recording started.

Greetings, Matteo.

Hi Pablo,

I’m making progress with the recording – I can now see the file correctly in the S3 bucket, and the start, stop, and list functions seem to be working as expected.

However, I’m still experiencing an issue: after starting the recording, the red timer icon below the session name doesn’t appear.

In the previous version 2.29, it would automatically update and show the red container as soon as the recording started. Now, even though the recording is clearly starting (confirmed by checking the S3 bucket and the generated egressId), the interface does not visually reflect that the recording is active.

I would like to clarify:

  • Is it enough to return only the egressId from the startRoomCompositeEgress() call?
  • Should additional values (such as status, stream info, etc.) be included so that the interface recognizes the recording state?
  • Is there a WebSocket event or polling mechanism that the interface expects to update itself accordingly?

Best regards,
Matteo