
GetUserMedia() does not work for Safari and IOS devices #1550

Open · 0010SS opened this issue Mar 18, 2024 · 17 comments

Comments

@0010SS

0010SS commented Mar 18, 2024

Hi! I am currently developing an app with the flutter_webrtc package. When I call WebRTC's getUserMedia method in my Flutter web app as shown below, the browser never prompts for camera permission and the whole app just hangs. This happens in Safari on macOS and in both Chrome and Safari on iOS.

var stream = await navigator.mediaDevices.getUserMedia(mediaConstraints);
_localStream = stream;
_localRenderer.srcObject = _localStream;
stream.getTracks().forEach((element) {
  _peerConnection!.addTrack(element, stream); 
});

This code comes directly from the official flutter_webrtc sample: https://github.com/flutter-webrtc/flutter-webrtc/blob/main/example/lib/src/get_user_media_sample.dart. My versions: flutter_webrtc 0.9.47, Safari on macOS 17.3, Dart SDK 3.2.3, Flutter 3.16.5.
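As an aside (not part of the sample), wrapping the call with a timeout and basic error handling makes a stalled call visible instead of silently hanging; the constraint map and the 10-second timeout below are placeholder values. Note also that getUserMedia is only available in a secure context (HTTPS or localhost), which Safari enforces strictly.

import 'dart:async';

import 'package:flutter/foundation.dart' show debugPrint;
import 'package:flutter_webrtc/flutter_webrtc.dart';

// Sketch: surface a stalled getUserMedia call instead of hanging forever.
// The constraints and the timeout are illustrative values.
Future<MediaStream?> tryGetUserMedia() async {
  final mediaConstraints = <String, dynamic>{'audio': true, 'video': true};
  try {
    return await navigator.mediaDevices
        .getUserMedia(mediaConstraints)
        .timeout(const Duration(seconds: 10));
  } on TimeoutException {
    debugPrint('getUserMedia never completed - no permission prompt appeared');
  } catch (e) {
    debugPrint('getUserMedia failed: $e');
  }
  return null;
}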

In addition, I found the following snippet in the compiled JavaScript file main.dart.js. It seems the generated code falls back to a method called webkitGetUserMedia, which should cover Safari, but it still doesn't work. Could this be due to some restriction by Apple?

A.zu.prototype = {
        WO(a, b, c) {
            var s = new A.ay($.ah, t.xN),
                r = new A.bF(s, t.Rt),
                q = A.ar(["audio", b, "video", c], t.N, t.z),
                p = !a.getUserMedia
            p.toString
            if (p)
                a.getUserMedia = a.getUserMedia || a.webkitGetUserMedia || a.mozGetUserMedia || a.msGetUserMedia
            this.a51(a, new A.W_([], []).li(q), new A.a86(r), new A.a87(r))
            return s
        },
        a51(a, b, c, d) {
            return a.getUserMedia(b, A.ji(c, 1), A.ji(d, 1))
        }
}
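For what it's worth, the decompiled branch above is the legacy callback-style getUserMedia fallback. As far as I'm aware, Safari only exposes the promise-based navigator.mediaDevices.getUserMedia and never shipped a webkitGetUserMedia prefix, so if that fallback branch is taken on Safari it cannot succeed. A quick probe of what the browser actually exposes might look like this (a sketch using dart:html; the helper name is made up):

import 'dart:html' as html;

// Hypothetical probe: report whether the promise-based API exists in the
// current browser, which is the path Safari is expected to take.
void probeGetUserMediaSupport() {
  final supported = html.window.navigator.mediaDevices != null;
  print('navigator.mediaDevices available: $supported');
}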

I would greatly appreciate help from anyone who knows a solution so that getUserMedia can work in a Flutter web app on iOS devices.

@ArjunBhilare

I'm facing the same issue. Any solution?

@0010SS
Author

0010SS commented Mar 20, 2024

I'm facing the same issue. Any solution?

For me, the issue was not with getUserMedia() itself. Safari's WebRTC console log reported that the member RTCIceServer.urls is required, so changing the ICE server key in the peer connection configuration from "url" to "urls" solved the issue in Safari. Namely,

final configuration = <String, dynamic>{
  "iceServers": [
    {
      "urls": "..",  // the key here should be "urls" not "url"
      "username": "...",
      "credential": "..."
    },
  ],
  'sdpSemantics': 'unified-plan',
};

_peerConnection = await createPeerConnection(configuration);

Now getUserMedia() works: the browsers on both my iPhone and my Mac request the camera and render the local stream. However, the WebRTC connection still does not work from the iPhone, in either Chrome or Safari. The media stream track is connected, but no frames are sent from the iPhone to the other peer. I believe this is an iOS problem, but I'm still working it out. Does anyone have a solution?
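One way to narrow down where the iPhone-to-peer path breaks (a sketch, not a confirmed fix) is to log the connection-state callbacks that flutter_webrtc exposes and check whether ICE ever reaches connected/completed:

import 'package:flutter/foundation.dart' show debugPrint;
import 'package:flutter_webrtc/flutter_webrtc.dart';

// Sketch: attach state callbacks so the iPhone side reports how far the
// connection actually gets. `configuration` is the same map used elsewhere
// in this thread.
Future<RTCPeerConnection> createPeerConnectionWithLogging(
    Map<String, dynamic> configuration) async {
  final pc = await createPeerConnection(configuration);
  pc.onIceGatheringState = (state) => debugPrint('ICE gathering: $state');
  pc.onIceConnectionState = (state) => debugPrint('ICE connection: $state');
  pc.onConnectionState = (state) => debugPrint('peer connection: $state');
  // If ICE never reaches connected/completed, the problem is connectivity
  // (a TURN server may be needed); if it does, the issue is media-level.
  return pc;
}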

@ArjunBhilare

ArjunBhilare commented Mar 20, 2024

I am building a flutter-webrtc based app for iOS and Android. I have

final Map<String, dynamic> configuration = {
    "sdpSemantics": "plan-b",
    'iceServers': [
      {
        "urls": [
          'stun:stun.l.google.com:19302',
          'stun:stun1.l.google.com:19302'
        ],
      }
    ]
  };
1. When I set 'sdpSemantics': 'unified-plan', the app crashes when a user joins (see the unified-plan sketch below this comment).

2. Also, I have:
final _localRenderer = RTCVideoRenderer();

await _localRenderer.initialize();

MediaStream _localStream =
     await navigator.mediaDevices.getUserMedia(mediaConstraints);

_localRenderer.srcObject = _localStream;
 
 Positioned(
       bottom: 10,
       right: 10,
       child: SizedBox(
         width: 150,
         height: 200,
         child: RTCVideoView(
           _localRenderer,
           mirror: true,
           objectFit: RTCVideoViewObjectFit.RTCVideoViewObjectFitCover,
         ),
       ),
     ),

The camera turns on, but the RTCVideoView for _localRenderer shows nothing, and no video stream is shown to the other user. Any insights on this?

The audio works, but the video is not shown on iOS.
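For reference (an observation, not a confirmed diagnosis of the crash): plan-b semantics have been removed from modern browsers, and under unified-plan local media is attached per track with addTrack while remote media arrives via onTrack, rather than through addStream/onAddStream. A minimal sketch under that assumption, with placeholder constraints and ICE servers:

import 'package:flutter_webrtc/flutter_webrtc.dart';

// Sketch: unified-plan setup - addTrack for local media, onTrack for remote.
Future<RTCPeerConnection> setUpUnifiedPlan(
    RTCVideoRenderer localRenderer, RTCVideoRenderer remoteRenderer) async {
  final config = <String, dynamic>{
    'sdpSemantics': 'unified-plan',
    'iceServers': [
      {
        'urls': ['stun:stun.l.google.com:19302'],
      },
    ],
  };
  final pc = await createPeerConnection(config);

  final localStream = await navigator.mediaDevices
      .getUserMedia({'audio': true, 'video': true});
  localRenderer.srcObject = localStream;
  for (final track in localStream.getTracks()) {
    await pc.addTrack(track, localStream);
  }

  pc.onTrack = (RTCTrackEvent event) {
    if (event.track.kind == 'video' && event.streams.isNotEmpty) {
      remoteRenderer.srcObject = event.streams[0];
    }
  };
  return pc;
}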

@cloudwebrtc
Member

I think you need to call _localRenderer.srcObject = _localStream; inside setState to notify Flutter to update your widget.

setState((){
   _localRenderer.srcObject = _localStream;
});
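For completeness, a minimal sketch of the renderer lifecycle this implies: initialize the renderer before assigning srcObject, assign it inside setState, and dispose of both the renderer and the stream. The widget name and constraints are illustrative.

import 'package:flutter/material.dart';
import 'package:flutter_webrtc/flutter_webrtc.dart';

class CallPage extends StatefulWidget {
  const CallPage({super.key});

  @override
  State<CallPage> createState() => _CallPageState();
}

class _CallPageState extends State<CallPage> {
  final _localRenderer = RTCVideoRenderer();
  MediaStream? _localStream;

  @override
  void initState() {
    super.initState();
    _start();
  }

  Future<void> _start() async {
    await _localRenderer.initialize(); // must finish before srcObject is set
    final stream = await navigator.mediaDevices
        .getUserMedia({'audio': true, 'video': true});
    if (!mounted) return;
    setState(() {
      _localStream = stream;
      _localRenderer.srcObject = stream; // inside setState so the view rebuilds
    });
  }

  @override
  void dispose() {
    _localRenderer.dispose();
    _localStream?.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return RTCVideoView(_localRenderer, mirror: true);
  }
}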

@Garciconx

Garciconx commented Mar 20, 2024


I'm facing the same issue. I already have the "urls" key instead of "url", but it keeps failing in Safari and Chrome on iOS and in Safari on macOS.

Any possible solution? CC @cloudwebrtc @0010SS

@0010SS
Author

0010SS commented Mar 20, 2024


Could you provide a snapshot of your code?

@Garciconx

Garciconx commented Mar 20, 2024

Map<String, dynamic> configuration = {
    'iceServers': [
      {
        'urls': [
          'stun:stun1.l.google.com:19302',
          'stun:stun2.l.google.com:19302',
        ]
      }
    ]
  };

CC @0010SS
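Two hedged observations about this configuration: it omits 'sdpSemantics' (the earlier working Safari config used 'unified-plan'), and it lists only STUN servers, so peers on different networks may never find a working candidate pair without a TURN server. A variant to try, with a placeholder TURN entry:

final configuration = <String, dynamic>{
  'sdpSemantics': 'unified-plan',
  'iceServers': [
    {
      'urls': [
        'stun:stun1.l.google.com:19302',
        'stun:stun2.l.google.com:19302',
      ],
    },
    // Placeholder TURN server - only needed if direct/STUN candidates
    // never connect (e.g. peers behind symmetric NATs).
    {
      'urls': 'turn:turn.example.com:3478',
      'username': '...',
      'credential': '...',
    },
  ],
};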

@Garciconx

Any possible solution? @cloudwebrtc

@0010SS
Author

0010SS commented Mar 22, 2024

Any possible solution? @cloudwebrtc

Have you tried checking the WebRTC console log in Safari? You can probably spot the errors there and go from there.

@Garciconx

Garciconx commented Mar 22, 2024

CC @0010SS @cloudwebrtc

(Screenshots attached: 2024-03-22 at 8:58 a.m., 8:59 a.m., and 9:00 a.m., plus one additional image.)

Timer(const Duration(seconds: 2), () {
  peer_connection.onTrack = (RTCTrackEvent event) {
    debugPrint('Got remote track: ${event.streams[0]}');

    event.streams[0].getTracks().forEach((track) {
      debugPrint('Add a track to the remoteStream $track');
      remote_streams.last.addTrack(track);
    });
  };
});

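One thing that stands out here (an observation, not a confirmed diagnosis): onTrack is registered two seconds after the peer connection is created, so any track events Safari fires before that are silently lost. Registering the handler immediately after createPeerConnection avoids the race; a sketch with illustrative names:

import 'package:flutter/foundation.dart' show debugPrint;
import 'package:flutter_webrtc/flutter_webrtc.dart';

// Sketch: register onTrack right after the peer connection is created,
// before any offer/answer exchange, so early track events are not missed.
Future<RTCPeerConnection> createConnection(Map<String, dynamic> configuration,
    RTCVideoRenderer remoteRenderer) async {
  final pc = await createPeerConnection(configuration);
  pc.onTrack = (RTCTrackEvent event) {
    debugPrint('Got remote ${event.track.kind} track');
    if (event.streams.isNotEmpty) {
      remoteRenderer.srcObject = event.streams[0];
    }
  };
  return pc;
}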

@Garciconx

Garciconx commented Mar 22, 2024

CC @0010SS @cloudwebrtc

The problem is that it fails to add the track on Safari; this issue does not occur in Google Chrome, on Android, or on iOS.

In the mobile app, the code adds both the local and remote tracks, but Safari fails to add the remote track.

@0010SS
Author

0010SS commented Mar 22, 2024


Have you checked the WebRTC console log of Safari to see whether there are any errors? That's how I solved my problem.

@Garciconx


Yes, I posted screenshots and the code in my earlier comment.

@Garciconx

Safari doesn't tell me much about the error; the browser only shows the line of code where it occurs.

@harshmdr-devslane

Any update here?

@yevgeniaronov

I managed to fix this by adding a 'mandatory' object under 'video':

  var stream = await navigator.mediaDevices.getUserMedia({
        'video': {
          'facingMode': 'user',
          'mandatory': {
            'minWidth': '640',
            'minHeight': '480',
          },
        },
        'audio': kReleaseMode,
      });

I don't actually understand the logic behind this; at the very least there should be a warning that this is required.
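As an untested aside, the spec-style width/height constraint syntax may be worth comparing against the 'mandatory' form above; whether Safari treats the two identically is not confirmed here.

import 'package:flutter_webrtc/flutter_webrtc.dart';

// Untested alternative: spec-style constraints instead of the legacy
// 'mandatory' block.
Future<MediaStream> getUserMediaSpecStyle() {
  return navigator.mediaDevices.getUserMedia({
    'video': {
      'facingMode': 'user',
      'width': {'ideal': 640},
      'height': {'ideal': 480},
    },
    'audio': true,
  });
}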

@bhaskarblur

Worked! Thanks
