On-Premises Deployment 2.15 - unable to get Media Node to start

Hello,

We are new to OpenVidu Pro deployment (we had been working with OpenVidu CE until the license became available).

We are deploying within Azure on newly created Ubuntu servers and have configured the associated NSG rules as per the documentation.
When it comes to the media node deployment, we are having difficulty getting the Media Node to start.
Running sudo ./media_node start gives initial indications that it is starting, but we then can't find any process on the server to show that it is running. Checking with netstat and lsof, we can't see anything listening on port 8888.
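
For reference, these are the kinds of checks we ran on the Media Node (both came back empty):

sudo netstat -tlnp | grep 8888   # no listener on the KMS port
sudo lsof -i :8888               # nothing bound to 8888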

Could you please advise what we can use to check that the Kurento Media Server has successfully started?

We are unsure whether the NSG rules are causing a problem, but the ports have been exposed appropriately within the NSG covering the associated subnets, and we even moved the KMS into the same subnet as the OpenVidu Pro Node, which didn't make a difference.

Any advice you can offer would be very much appreciated.

Thanks,

Ian.

Hi @IanPounder, you can't see any Kurento Media Server running in your Media Node because the OpenVidu Pro Node is the one that deploys this container, using the media-node-controller service. Once the media-node-controller is running and port 3000 is reachable, you must run OpenVidu Pro and then add the Media Nodes through the Inspector, or simply add the ws://NEW_MEDIA_NODE_IP:8888/kurento URL to the KMS_URIS configuration parameter in the .env of your OpenVidu Pro Node.

To clarify the process:

  1. Open all the documented ports
  2. Go to your Media Node and run ./media_node start in /opt/kms (a quick way to verify this step is shown below)
  3. Go to your OpenVidu Pro Node. There you can configure your Media Node by adding ws://NEW_MEDIA_NODE_IP:8888/kurento to KMS_URIS and running ./openvidu start in /opt/openvidu. Or you can just run ./openvidu start and add the nodes with the OpenVidu Inspector
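
To confirm step 2 worked, check on the Media Node that the controller container is up and listening on port 3000, for example:

docker ps | grep media-node-controller   # the controller container should be Up
sudo lsof -i :3000                       # and listening on port 3000

Note that at this point you will still see nothing on port 8888: the KMS container is only launched once the OpenVidu Pro Node registers the Media Node.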

Best Regards,
Carlos

Hi @cruizba,

So we have done this, and when we update the .env for OpenVidu we have KMS_URIS configured as KMS_URIS=["ws://172.16.4.5:8888/kurento"] - 172.16.4.5 is the private IP of the Media Server node.

When we execute ./openvidu start, we get the following back:
openvidu-server_1 | [INFO] 2020-09-10 12:52:15,954 [nioEventLoopGroup-3-1] org.kurento.jsonrpc.client.JsonRpcClientNettyWebSocket - [KurentoClient] Initiating new Netty channel. Will create new handler too!
openvidu-server_1 | [WARN] 2020-09-10 12:52:15,957 [main] org.kurento.jsonrpc.client.JsonRpcClientNettyWebSocket - [KurentoClient] Trying to close a JsonRpcClientNettyWebSocket with channel == null
openvidu-server_1 | [ERROR] 2020-09-10 12:52:15,963 [main] io.openvidu.server.kurento.kms.KmsManager - OpenVidu Server couldn't connect to KMS with uri ws://172.16.4.5:8888/kurento
openvidu-server_1 | [ERROR] 2020-09-10 12:52:15,964 [main] io.openvidu.server.kurento.kms.KmsManager - None of the KMSs in [ws://172.16.4.5:8888/kurento] are within reach of OpenVidu Server
openvidu-server_1 | [ERROR] 2020-09-10 12:52:15,964 [main] io.openvidu.server.kurento.kms.KmsManager - Shutting down OpenVidu Server

The NSG rule specifically related to 8888 is set as

Port: 8888
Protocol: TCP
Source: 172.16.4.0/26
Destination: 172.16.4.0/26
Action: Allow
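
(For reference, the Azure CLI equivalent of that rule would be roughly the following - resource group, NSG name and priority are placeholders:)

az network nsg rule create \
  --resource-group <resource-group> \
  --nsg-name <nsg-name> \
  --name Allow-KMS-8888 \
  --priority 100 \
  --direction Inbound \
  --access Allow \
  --protocol Tcp \
  --source-address-prefixes 172.16.4.0/26 \
  --destination-address-prefixes 172.16.4.0/26 \
  --destination-port-ranges 8888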

The OpenVidu Server and Media Node are within the same subnet (both located within the above IP range).

Any advice?

Thanks,

Ian.

It might also be worth pointing out that port 3000 is also configured to allow traffic within the subnet range mentioned above.

Thanks,

Ian.

Please, could you paste the entire openvidu log? Also, in your Media Node, what is the output of docker ps?
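
You can also check basic reachability from your OpenVidu Pro Node to the Media Node with something like:

nc -zv 172.16.4.5 3000   # media-node-controller must be reachable
nc -zv 172.16.4.5 8888   # KMS (will only answer once the container has been launched)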

Regards,
Carlos

Log information:

openvidu-server_1 | =======================================
openvidu-server_1 | = LAUNCH OPENVIDU-SERVER =
openvidu-server_1 | =======================================
openvidu-server_1 |
openvidu-server_1 | ______________________________________________________________
openvidu-server_1 | [OpenVidu ASCII-art logo]
openvidu-server_1 | version 2.15.1
openvidu-server_1 | ______________________________________________________________
openvidu-server_1 |
openvidu-server_1 | [INFO] 2020-09-10 12:52:10,662 [main] io.openvidu.server.pro.OpenViduServerPro - Starting OpenViduServerPro on <> with PID 134 (/opt/openvidu/openvidu-server.jar started by root in /opt/openvidu)
openvidu-server_1 | [INFO] 2020-09-10 12:52:10,672 [main] io.openvidu.server.pro.OpenViduServerPro - No active profile set, falling back to default profiles: default
openvidu-server_1 | [INFO] 2020-09-10 12:52:11,052 [main] io.openvidu.server.config.OpenviduConfig - Configuration properties read from file /opt/openvidu/.env
openvidu-server_1 | [INFO] 2020-09-10 12:52:11,163 [main] io.openvidu.server.pro.OpenViduServerPro - Started OpenViduServerPro in 2.002 seconds (JVM running for 2.702)
openvidu-server_1 | [INFO] 2020-09-10 12:52:11,166 [main] io.openvidu.server.OpenViduServer -
openvidu-server_1 |
openvidu-server_1 |
openvidu-server_1 | Configuration properties
openvidu-server_1 | ------------------------
openvidu-server_1 |
openvidu-server_1 | * CERTIFICATE_TYPE=letsencrypt
openvidu-server_1 | * DOMAIN_OR_PUBLIC_IP=xxxxxxxxxx.uksouth.cloudapp.azure.com
openvidu-server_1 | * HTTPS_PORT=443
openvidu-server_1 | * KMS_URIS=["ws://172.16.4.5:8888/kurento"]
openvidu-server_1 | * OPENVIDU_CDR=false
openvidu-server_1 | * OPENVIDU_CDR_PATH=/opt/openvidu/cdr
openvidu-server_1 | * OPENVIDU_PRO_CHECK_DIND_UPDATES=true
openvidu-server_1 | * OPENVIDU_PRO_CLUSTER=true
openvidu-server_1 | * OPENVIDU_PRO_CLUSTER_AUTOSCALING=false
openvidu-server_1 | * OPENVIDU_PRO_CLUSTER_AUTOSCALING_INTERVAL=10
openvidu-server_1 | * OPENVIDU_PRO_CLUSTER_AUTOSCALING_MAX_LOAD=70
openvidu-server_1 | * OPENVIDU_PRO_CLUSTER_AUTOSCALING_MAX_NODES=2
openvidu-server_1 | * OPENVIDU_PRO_CLUSTER_AUTOSCALING_MIN_LOAD=30
openvidu-server_1 | * OPENVIDU_PRO_CLUSTER_AUTOSCALING_MIN_NODES=1
openvidu-server_1 | * OPENVIDU_PRO_CLUSTER_ENVIRONMENT=on_premise
openvidu-server_1 | * OPENVIDU_PRO_CLUSTER_ID=
openvidu-server_1 | * OPENVIDU_PRO_CLUSTER_LOAD_INTERVAL=3
openvidu-server_1 | * OPENVIDU_PRO_CLUSTER_MEDIA_NODES=1
openvidu-server_1 | * OPENVIDU_PRO_CLUSTER_MODE=auto
openvidu-server_1 | * OPENVIDU_PRO_CLUSTER_PATH=/opt/openvidu/cluster
openvidu-server_1 | * OPENVIDU_PRO_CLUSTER_TEST=false
openvidu-server_1 | * OPENVIDU_PRO_ELASTICSEARCH_MAX_DAYS_DELETE=0
openvidu-server_1 | * OPENVIDU_PRO_LICENSE=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXx
openvidu-server_1 | * OPENVIDU_PRO_PRIVATE_IP=
openvidu-server_1 | * OPENVIDU_PRO_STATS_MONITORING_INTERVAL=30
openvidu-server_1 | * OPENVIDU_PRO_STATS_WEBRTC_INTERVAL=30
openvidu-server_1 | * OPENVIDU_RECORDING=true
openvidu-server_1 | * OPENVIDU_RECORDING_AUTOSTOP_TIMEOUT=240
openvidu-server_1 | * OPENVIDU_RECORDING_COMPOSED_URL=
openvidu-server_1 | * OPENVIDU_RECORDING_CUSTOM_LAYOUT=/opt/openvidu/custom-layout
openvidu-server_1 | * OPENVIDU_RECORDING_DEBUG=true
openvidu-server_1 | * OPENVIDU_RECORDING_NOTIFICATION=publisher_moderator
openvidu-server_1 | * OPENVIDU_RECORDING_PATH=/opt/openvidu/recordings
openvidu-server_1 | * OPENVIDU_RECORDING_PUBLIC_ACCESS=false
openvidu-server_1 | * OPENVIDU_RECORDING_VERSION=2.15.0
openvidu-server_1 | * OPENVIDU_SECRET=ABC!123XYZ456
openvidu-server_1 | * OPENVIDU_SESSIONS_GARBAGE_INTERVAL=900
openvidu-server_1 | * OPENVIDU_SESSIONS_GARBAGE_THRESHOLD=3600
openvidu-server_1 | * OPENVIDU_STREAMS_VIDEO_MAX_RECV_BANDWIDTH=1000
openvidu-server_1 | * OPENVIDU_STREAMS_VIDEO_MAX_SEND_BANDWIDTH=1000
openvidu-server_1 | * OPENVIDU_STREAMS_VIDEO_MIN_RECV_BANDWIDTH=200
openvidu-server_1 | * OPENVIDU_STREAMS_VIDEO_MIN_SEND_BANDWIDTH=200
openvidu-server_1 | * OPENVIDU_WEBHOOK=false
openvidu-server_1 | * OPENVIDU_WEBHOOK_ENDPOINT=
openvidu-server_1 | * OPENVIDU_WEBHOOK_EVENTS=[sessionCreated,sessionDestroyed,participantJoined,participantLeft,webrtcConnectionCreated,webrtcConnectionDestroyed,recordingStatusChanged,filterEventDispatched,mediaNodeStatusChanged]
openvidu-server_1 | * OPENVIDU_WEBHOOK_HEADERS=[]
openvidu-server_1 |
openvidu-server_1 |
openvidu-server_1 |
openvidu-server_1 | [WARN] 2020-09-10 12:52:11,168 [main] io.openvidu.server.pro.OpenViduServerPro - You have set property server.port (or SERVER_PORT). This will serve OpenVidu Server Pro on your host at port 5443. But property HTTPS_PORT (443) still configures the port that should be used to connect to OpenVidu Server from outside. Bear this in mind when configuring a proxy in front of OpenVidu Server
openvidu-server_1 | [INFO] 2020-09-10 12:52:11,319 [main] io.openvidu.server.pro.OpenViduServerPro - Starting OpenViduServerPro on ResponseEyeOVPNode1 with PID 134 (/opt/openvidu/openvidu-server.jar started by root in /opt/openvidu)
openvidu-server_1 | [INFO] 2020-09-10 12:52:11,322 [main] io.openvidu.server.pro.OpenViduServerPro - No active profile set, falling back to default profiles: default
openvidu-server_1 | [INFO] 2020-09-10 12:52:12,877 [main] io.openvidu.server.config.OpenviduConfig - Configuration properties read from file /opt/openvidu/.env
openvidu-server_1 | [INFO] 2020-09-10 12:52:13,705 [main] org.springframework.boot.web.embedded.tomcat.TomcatWebServer - Tomcat initialized with port(s): 5443 (http)
openvidu-server_1 | [INFO] 2020-09-10 12:52:13,730 [main] org.apache.catalina.core.StandardService - Starting service [Tomcat]
openvidu-server_1 | [INFO] 2020-09-10 12:52:13,731 [main] org.apache.catalina.core.StandardEngine - Starting Servlet engine: [Apache Tomcat/9.0.30]
openvidu-server_1 | [INFO] 2020-09-10 12:52:13,825 [main] org.apache.catalina.core.ContainerBase.[Tomcat].[localhost].[/] - Initializing Spring embedded WebApplicationContext
openvidu-server_1 | [INFO] 2020-09-10 12:52:13,825 [main] org.springframework.web.context.ContextLoader - Root WebApplicationContext: initialization completed in 2468 ms
openvidu-server_1 | [INFO] 2020-09-10 12:52:14,152 [main] io.openvidu.server.config.OpenviduConfig - Configuration properties read from file /opt/openvidu/.env
openvidu-server_1 | [INFO] 2020-09-10 12:52:14,202 [main] io.openvidu.server.pro.OpenViduServerPro - OpenVidu Pro is deployed in 'on_premise' environment
openvidu-server_1 | [WARN] 2020-09-10 12:52:14,202 [main] io.openvidu.server.pro.OpenViduServerPro - OpenVidu Pro cluster mode is 'manual'. There will be no automatic instances management
openvidu-server_1 | [INFO] 2020-09-10 12:52:14,223 [main] io.openvidu.server.pro.OpenViduServerPro - OpenVidu Pro cluster mode enabled
openvidu-server_1 | [INFO] 2020-09-10 12:52:14,224 [main] io.openvidu.server.pro.OpenViduServerPro - Cluster identifier got from disk: clu_EVqEXlyW
openvidu-server_1 | [INFO] 2020-09-10 12:52:14,234 [main] io.openvidu.server.pro.OpenViduServerPro - OpenVidu Pro Elasticsearch service is enabled
openvidu-server_1 | [INFO] 2020-09-10 12:52:15,342 [main] io.openvidu.server.pro.cdr.CDRLoggerElasticSearch - Elasticsearch is accessible at 127.0.0.1:9200
openvidu-server_1 | [INFO] 2020-09-10 12:52:15,468 [main] io.openvidu.server.pro.cdr.CDRLoggerElasticSearch - Elasticsearch version is 7.8.0
openvidu-server_1 | [INFO] 2020-09-10 12:52:15,519 [main] io.openvidu.server.pro.cdr.CDRLoggerElasticSearch - Elasticsearch index "openvidu" already exists
openvidu-server_1 | [INFO] 2020-09-10 12:52:15,520 [main] io.openvidu.server.pro.OpenViduServerPro - OpenVidu Pro CDR service is disabled (may be enabled with 'OPENVIDU_CDR=true')
openvidu-server_1 | [INFO] 2020-09-10 12:52:15,520 [main] io.openvidu.server.pro.OpenViduServerPro - OpenVidu Pro Webhook service is disabled (may be enabled with 'OPENVIDU_WEBHOOK=true')
openvidu-server_1 | [INFO] 2020-09-10 12:52:15,525 [main] io.openvidu.server.kurento.kms.KmsManager - OpenVidu Server Pro is deployed with 'OPENVIDU_PRO_CLUSTER_MODE' set to 'manual'. Initializing Media Nodes defined in parameter 'KMS_URIS': [ws://172.16.4.5:8888/kurento]
openvidu-server_1 | [INFO] 2020-09-10 12:52:15,630 [JsonRpcClient-hearbeatExec-e1-t0] org.kurento.jsonrpc.client.JsonRpcClientNettyWebSocket - [KurentoClient] Connecting native client
openvidu-server_1 | [INFO] 2020-09-10 12:52:15,630 [JsonRpcClient-hearbeatExec-e1-t0] org.kurento.jsonrpc.client.JsonRpcClientNettyWebSocket - [KurentoClient] Creating new NioEventLoopGroup
openvidu-server_1 | [INFO] 2020-09-10 12:52:15,663 [I/O dispatcher 1] io.openvidu.server.pro.cdr.CDRLoggerElasticSearch - New event of type "cdr" sent to Elasticsearch: {"timestamp":1599742335541,"id":"kms_ZUG8oadr","environmentId":null,"ip":"172.16.4.5","uri":"ws://172.16.4.5:8888/kurento","newStatus":"launching","oldStatus":null,"clusterId":"clu_EVqEXlyW","event":"mediaNodeStatusChanged","elastic_type":"cdr"}
openvidu-server_1 | [INFO] 2020-09-10 12:52:15,862 [nioEventLoopGroup-2-1] org.kurento.jsonrpc.client.JsonRpcClientNettyWebSocket - [KurentoClient] Initiating new Netty channel. Will create new handler too!
openvidu-server_1 | [WARN] 2020-09-10 12:52:15,935 [JsonRpcClient-hearbeatExec-e1-t0] org.kurento.jsonrpc.client.JsonRpcClientNettyWebSocket - [KurentoClient] Trying to close a JsonRpcClientNettyWebSocket with channel == null
openvidu-server_1 | [INFO] 2020-09-10 12:52:15,951 [main] org.kurento.jsonrpc.client.JsonRpcClientNettyWebSocket - [KurentoClient] Connecting native client
openvidu-server_1 | [WARN] 2020-09-10 12:52:15,951 [JsonRpcClient-hearbeatExec-e1-t0] org.kurento.jsonrpc.client.JsonRpcClient - [KurentoClient] Error sending heartbeat to server. Exception: [KurentoClient] Exception connecting to WebSocket server ws://172.16.4.5:8888/kurento
openvidu-server_1 | [INFO] 2020-09-10 12:52:15,951 [main] org.kurento.jsonrpc.client.JsonRpcClientNettyWebSocket - [KurentoClient] Creating new NioEventLoopGroup
openvidu-server_1 | [WARN] 2020-09-10 12:52:15,951 [JsonRpcClient-hearbeatExec-e1-t0] org.kurento.jsonrpc.client.JsonRpcClient - [KurentoClient] Stopping heartbeat and closing client: failure during heartbeat mechanism
openvidu-server_1 | [INFO] 2020-09-10 12:52:15,954 [nioEventLoopGroup-3-1] org.kurento.jsonrpc.client.JsonRpcClientNettyWebSocket - [KurentoClient] Initiating new Netty channel. Will create new handler too!
openvidu-server_1 | [WARN] 2020-09-10 12:52:15,957 [main] org.kurento.jsonrpc.client.JsonRpcClientNettyWebSocket - [KurentoClient] Trying to close a JsonRpcClientNettyWebSocket with channel == null
openvidu-server_1 | [ERROR] 2020-09-10 12:52:15,963 [main] io.openvidu.server.kurento.kms.KmsManager - OpenVidu Server couldn't connect to KMS with uri ws://172.16.4.5:8888/kurento
openvidu-server_1 | [ERROR] 2020-09-10 12:52:15,964 [main] io.openvidu.server.kurento.kms.KmsManager - None of the KMSs in [ws://172.16.4.5:8888/kurento] are within reach of OpenVidu Server
openvidu-server_1 | [ERROR] 2020-09-10 12:52:15,964 [main] io.openvidu.server.kurento.kms.KmsManager - Shutting down OpenVidu Server

docker ps:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
e060d5d6f1c2 openvidu/media-node-controller:1.0.0 "/bin/sh -c '/beats/…" 2 hours ago Up 2 hours 0.0.0.0:3000->3000/tcp kms_media-node-controller_1

I think the property OPENVIDU_PRO_CLUSTER_MODE is not set to manual.

Could you verify that you have this property in your .env, or change it accordingly:

# Mode of cluster management. Can be auto (OpenVidu manages Media Nodes on its own.
# Parameter KMS_URIS is ignored) or manual (user must manage Media Nodes. Parameter
# KMS_URIS is used: if any uri is provided it must be valid)
OPENVIDU_PRO_CLUSTER_MODE=manual

Then restart your OpenVidu Node with this property set.
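
For example (assuming the standard installation path used above):

cd /opt/openvidu
./openvidu stop
./openvidu start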

Regards,
Carlos

Brilliant! Many thanks for your help - we are now up and running 🙂
