OpenVidu Pro S3 Upload

Hi,

I have set up OpenVidu Pro and enabled S3 upload, but when it tries to upload a recording I see the following logs:

[ERROR] 2020-12-09 10:51:51,960 [http-nio-0.0.0.0-5443-exec-1] io.openvidu.server.pro.recording.service.RecordingManagerUtilsS3 - AmazonServiceException when listing objects from bucket BUCKET_NAME. S3 couldn't process the call: The provided token has expired.
[ERROR] 2020-12-09 10:46:23,381 [Thread-19] io.openvidu.server.pro.recording.S3RecordingUploader - Error uploading recording 2f9d7ac1-fdbf-432b-9d23-1e6abfe99259-1 to S3 bucket path BUCKET_NAME/: The provided token has expired. (Service: Amazon S3; Status Code: 400; Error Code: ExpiredToken; Request ID: BV3Z7S0H7KFH5MCW; S3 Extended Request ID: iqi5QS2ZZ0phe/bMcsH0BCUyBLi6ZLOC3XF7Cll5ycXZNUztcB+jsferWLh4jbKC3yOQLi66C04=; Proxy: null)

I have an AWS multi-account setup with SSO, and the EC2 IAM role has a session duration of 1 hour.

Thanks

You should use a token without expiration.

Can you please look into my issue and help?

Thanks

Ok, but an AWS Control Tower account with SSO set up does not have a “No expiration” option. It only allows a maximum session token duration of 12 hours for an IAM role.

Can you at least attach a fixed role to the EC2 instance? Or does it also expire?

If everything has an expiration time… we can’t do more.

Regards,
Carlos

There is no issue with the role; it is attached to the EC2 instance. For now, I have executed the following command and restarted the OpenVidu service. It seems to be working now, but I will need to check what happens after the session expires.

curl http://169.254.169.254/latest/meta-data/iam/security-credentials/Role_name
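The credentials returned by that metadata endpoint are temporary and include an `Expiration` timestamp, which tells you when they stop being valid. A real call only works from inside EC2, so this sketch parses a sample response (all values made up) to show where the expiry lives:

```shell
# Sample of the JSON the instance metadata service returns for a role
# (fields: AccessKeyId, SecretAccessKey, Token, Expiration, ...).
creds='{"AccessKeyId":"ASIAEXAMPLE","SecretAccessKey":"secret","Token":"token","Expiration":"2020-12-09T16:51:51Z"}'

# Extract the Expiration field to see when the temporary credentials expire.
expiration=$(echo "$creds" | sed -n 's/.*"Expiration":"\([^"]*\)".*/\1/p')
echo "Credentials expire at: $expiration"
```

When the instance profile is used directly by the AWS SDK (rather than static keys copied into configuration), the SDK refreshes these credentials automatically before they expire.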

@cruizba Just FYI, not sure about how you guys have implemented it but I found this in AWS support questions:

Thanks,
Tushar

You can edit the policy attached to your EC2 instance and give it access to your S3 bucket. You just need to add these statements to your policy:

        {
            "Action": [
                "s3:DeleteObject",
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::<YOUR_S3_BUCKET_NAME>/*"
            ],
            "Effect": "Allow"
        },
        {
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Resource": [
                "arn:aws:s3:::<YOUR_S3_BUCKET_NAME>"
            ],
            "Effect": "Allow"
        },
        {
            "Action": [
                "s3:ListAllMyBuckets"
            ],
            "Resource": "arn:aws:s3:::*",
            "Effect": "Allow"
        }

Replace <YOUR_S3_BUCKET_NAME> with your bucket name.
After editing the policy, delete the OPENVIDU_PRO_AWS_ACCESS_KEY and OPENVIDU_PRO_AWS_SECRET_KEY variables from your .env file and restart OpenVidu. They are not necessary when the policy is in place.
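A sketch of that .env change, assuming the default installation path /opt/openvidu (adjust paths to your deployment). Here a throwaway file stands in for the real .env:

```shell
# Stand-in for /opt/openvidu/.env with the two static-key variables present.
ENV_FILE=openvidu.env.example
printf 'OPENVIDU_PRO_AWS_ACCESS_KEY=AKIAEXAMPLE\nOPENVIDU_PRO_AWS_SECRET_KEY=examplesecret\n' > "$ENV_FILE"

# Comment out both variables so OpenVidu falls back to the instance-profile
# credentials instead of the expiring static keys.
sed -i 's/^OPENVIDU_PRO_AWS_ACCESS_KEY=/#&/; s/^OPENVIDU_PRO_AWS_SECRET_KEY=/#&/' "$ENV_FILE"
cat "$ENV_FILE"

# Then restart OpenVidu, e.g.: cd /opt/openvidu && ./openvidu restart
```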

Hi
In my case, I could upload recording files from the OpenVidu EC2 instance to the S3 bucket successfully.
I gave the OpenVidu EC2 instance an IAM role; the role is as follows.
I also wrote my bucket name in .env like this, without ‘s3://’:
OPENVIDU_PRO_AWS_S3_BUCKET=my_bucket_name

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:GetLifecycleConfiguration",
                "s3:GetBucketTagging",
                "s3:GetInventoryConfiguration",
                "s3:GetObjectVersionTagging",
                "s3:ListBucketVersions",
                "s3:GetBucketLogging",
                "s3:ListBucket",
                "s3:GetAccelerateConfiguration",
                "s3:GetBucketPolicy",
                "s3:GetObjectVersionTorrent",
                "s3:GetObjectAcl",
                "s3:GetEncryptionConfiguration",
                "s3:GetBucketObjectLockConfiguration",
                "s3:GetBucketRequestPayment",
                "s3:GetAccessPointPolicyStatus",
                "s3:GetObjectVersionAcl",
                "s3:GetObjectTagging",
                "s3:GetMetricsConfiguration",
                "s3:GetBucketOwnershipControls",
                "s3:DeleteObject",
                "s3:GetBucketPublicAccessBlock",
                "s3:GetBucketPolicyStatus",
                "s3:ListBucketMultipartUploads",
                "s3:GetObjectRetention",
                "s3:GetBucketWebsite",
                "s3:GetJobTagging",
                "s3:GetBucketVersioning",
                "s3:GetBucketAcl",
                "s3:GetObjectLegalHold",
                "s3:GetBucketNotification",
                "s3:GetReplicationConfiguration",
                "s3:ListMultipartUploadParts",
                "s3:PutObject",
                "s3:GetObject",
                "s3:GetObjectTorrent",
                "s3:DescribeJob",
                "s3:GetBucketCORS",
                "s3:GetAnalyticsConfiguration",
                "s3:GetObjectVersionForReplication",
                "s3:GetBucketLocation",
                "s3:GetAccessPointPolicy",
                "s3:GetObjectVersion"
            ],
            "Resource": [
                "arn:aws:s3:::<YOUR_S3_BUCKET_NAME>"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetBucketLocation",
                "s3:ListAllMyBuckets"
            ],
            "Resource": "arn:aws:s3:::*"
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": [
                "s3:GetAccessPoint",
                "s3:GetAccountPublicAccessBlock",
                "s3:ListAllMyBuckets",
                "s3:ListAccessPoints",
                "s3:ListJobs"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::<YOUR_S3_BUCKET_NAME>"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::<YOUR_S3_BUCKET_NAME>/*"
            ]
        }
    ]
}
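One practical tip: policies pasted from a browser often carry typographic quotes (“ ”) that break JSON parsing, so it is worth validating the file before attaching it to the role. A quick check with python3 (a minimal stand-in policy is used here as the example file):

```shell
# Write a minimal example policy and verify it parses as valid JSON.
cat > policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    { "Effect": "Allow", "Action": ["s3:ListBucket"], "Resource": "arn:aws:s3:::my-bucket" }
  ]
}
EOF

# json.tool exits non-zero (and prints the error position) on invalid JSON.
python3 -m json.tool policy.json > /dev/null && echo "policy.json is valid JSON"
```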