How do you go live?
- Press the red button in the bottom right corner of the screen
Why do I get "null error" on iOS when trying to do a screen capture?
- This is a known iOS issue that affects other apps as well, typically when attempting a screen capture. References:
- https://discussions.apple.com/thread/8344379
- https://www.mobilescout.com/apple/news/n103281/Facebook-iOS-acces-iOSS-11-screen-recording.html
- https://www.reddit.com/r/Twitch/comments/982kjh/streaming_null_error_iphone_please_help/
- https://community.teamviewer.com/t5/iOS/Live-Broadcast-to-TeamViewer-has-stopped-due-to-null/td-p/32767
Why doesn't audio work?
- It depends on the issue. If you can’t hear alert sounds, the sound file might be the wrong type: to work on iOS the file must be AAC, MP3, or WAV, while Android can play WAV, MP3, AAC, or OGG (see the format-check sketch after this list).
- If you can’t hear text to speech, only Android supports this right now.
- If you aren’t hearing audio from the mic on the stream, check the mute button. Bluetooth mics and headphones have known issues right now, and we are looking into how to resolve them.
- Android: If you are using headphones and expect the alert sounds to be heard on the stream, that will not work; Android doesn’t support capturing internal audio. Only what is captured by the microphone ends up on the stream.
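A minimal sketch of the format rule above, assuming a hypothetical `isSupportedAlertSound` helper (the helper name and the `.m4a` variant are assumptions; the extension lists come from the answer above):

```kotlin
// Hypothetical helper, not the app's actual code: checks whether an alert
// sound file's extension matches the formats listed above.
enum class Platform { IOS, ANDROID }

// iOS: AAC, MP3, WAV.  Android: WAV, MP3, AAC, OGG.
// The .m4a container for AAC is an assumption.
private val iosFormats = setOf("aac", "m4a", "mp3", "wav")
private val androidFormats = setOf("wav", "mp3", "aac", "m4a", "ogg")

fun isSupportedAlertSound(fileName: String, platform: Platform): Boolean {
    val ext = fileName.substringAfterLast('.', missingDelimiterValue = "").lowercase()
    return when (platform) {
        Platform.IOS -> ext in iosFormats
        Platform.ANDROID -> ext in androidFormats
    }
}
```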
Why aren't alerts on screen capture?
- Were the alerts set to “Show on Preview”, with the user expecting to see them while outside our app?
- When screen capturing, the app is in the background and effectively suspended, so it can’t render anything inside another app.
- [iOS only] Does the user expect them over the stream while screen broadcasting?
- Widgets consume a lot of system resources and are disabled during screen broadcasting on iOS to reduce the very likely risk of the app being killed by the system. This limitation is enforced by Apple: because we run as a broadcast extension, we are allowed only limited resources, and when that limit is reached the system kills our extension.
- We are working on the ability to use push notifications to show alerts so they can be seen on top of other apps.
Why isn't my external mic working?
- The audio source must be set to “external” or “default” in the app settings. Microphones connected via the 3.5mm jack should work. Some Bluetooth devices might work; others would need special support in the app if there are brand-specific compatibility issues to account for (we’d need the specific brand, since the connection type can differ, etc.).
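As a diagnostic for external-mic issues on Android, the standard `AudioManager` device query (API 23+) shows which input devices the system actually recognizes. This is only a troubleshooting sketch, not the app's settings code:

```kotlin
import android.content.Context
import android.media.AudioDeviceInfo
import android.media.AudioManager

// Diagnostic sketch: list the audio input devices Android reports, so you can
// see whether a 3.5mm or Bluetooth mic is actually recognized by the system.
fun listAudioInputs(context: Context): List<String> {
    val audioManager = context.getSystemService(Context.AUDIO_SERVICE) as AudioManager
    return audioManager.getDevices(AudioManager.GET_DEVICES_INPUTS).map { device ->
        val type = when (device.type) {
            AudioDeviceInfo.TYPE_BUILTIN_MIC -> "built-in mic"
            AudioDeviceInfo.TYPE_WIRED_HEADSET -> "wired headset mic (3.5mm)"
            AudioDeviceInfo.TYPE_BLUETOOTH_SCO -> "Bluetooth SCO mic"
            AudioDeviceInfo.TYPE_USB_DEVICE -> "USB audio device"
            else -> "other (type=${device.type})"
        }
        "${device.productName}: $type"
    }
}
```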
Why isn't my external camera working?
- Every external camera is different. Hardware compatibility tied from a specific device to a specific app (like a GoPro and the GoPro app) works, but for us to support external cameras we would need to configure each one individually. In the future we might whitelist a number of devices we can make compatible, but for now it is something we don’t support. External cameras are not the same thing as the internal camera hardware, and there is no universal way to use them.
Why do audio files on iOS need to be converted to WAV by the user? Why can't the server do that?
- This could be done on the server, but that’s for the web team to prioritize; mobile uses the Streamlabs backend and doesn’t have its own.
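If server-side conversion were ever picked up, it would roughly amount to a transcode step like the sketch below. This assumes ffmpeg is available on the host and is not how the Streamlabs backend actually works today:

```kotlin
import java.io.File

// Sketch only: convert an uploaded OGG alert sound to AAC so iOS can play it.
// Assumes the ffmpeg binary is installed on the server host.
fun convertOggToAac(input: File, output: File) {
    val process = ProcessBuilder(
        "ffmpeg", "-y",           // overwrite the output if it already exists
        "-i", input.absolutePath, // source file uploaded by the user
        "-c:a", "aac",            // encode the audio track as AAC
        output.absolutePath
    ).inheritIO().start()
    val exitCode = process.waitFor()
    require(exitCode == 0) { "ffmpeg failed with exit code $exitCode" }
}
```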
Why aren't the camera settings saved when restarting the app?
- Good question. The settings for every camera on every device can vary drastically. The intention was for the settings to allow minor tweaks after starting from the default state; we don’t currently persist the changes made by the user.
Why doesn't the stream do 60 fps when it's set to do so?
- Encoder FPS is constrained by the camera hardware and the Android camera API. We support 60 FPS whenever possible. In many cases the manufacturer’s camera app offers 60 FPS by bypassing the Android API, and neither we nor any other third-party app has access to that. As a result, such devices may present 30 FPS as the maximum through the public API and allow 60 FPS only in the manufacturer’s own dedicated app.
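To check what the public Android camera API actually reports on a given device (as opposed to what the vendor's own camera app can do), a Camera2 query along these lines can be used; this is a diagnostic sketch, not the app's capture pipeline:

```kotlin
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.cam2.CameraManager
import android.util.Range

// Diagnostic sketch: print the AE target FPS ranges the Android camera API
// exposes for each camera. If no range reaches 60, the encoder cannot be fed
// 60 fps through the public API, regardless of what the vendor's app offers.
fun logSupportedFpsRanges(context: Context) {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    for (cameraId in manager.cameraIdList) {
        val characteristics = manager.getCameraCharacteristics(cameraId)
        val ranges: Array<Range<Int>>? =
            characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES)
        println("Camera $cameraId supports FPS ranges: ${ranges?.joinToString()}")
    }
}
```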
Why does the stream lag when the connection is good? (Live is not the same as real-time.)
- Stream delay depends entirely on what happens between the broadcaster’s app and the viewer’s player. The encoded frames are sent over the network immediately, on a best-effort basis. Assuming the connection handles the throughput, further delay is added sequentially by the RTMP ingest server, the transcoding servers, the CDNs, and finally by the viewer’s player buffering (e.g. an HTML5 browser and the protocol used to access the live stream). In short, there’s no direct line between the app and the viewers.
- To confirm this, simply stream to a local RTMP server and play it back with zero buffering; the stream will play in real time. This is as close to a lab test as it gets.
Issues that may not be directly related to the mobile app...
- Recent Events stopped working suddenly
- If an update was made on the web side, it may not be sending information properly
- If a persistent white box is showing even after several tries, it indicates something went wrong inside the web page (like unintended JavaScript errors).
- Alert message text does not show on screen
- Could be restrictions set up in user settings, such as the enabled/disabled toggle, minimum donation requirements, font size, or other formatting options
- Alert sounds not playing
- iOS does not support audio formats like OGG (recommended: AAC, MP3, or WAV)
- Alert videos not playing
- iOS does not support the WEBM video codec/format; at best it will be shown as a black box. Recommended format: H.264.
- Text To Speech
- Only works on Android at the moment, because the text gets converted into an OGG file, which doesn’t play on iOS (we are currently working on an iOS solution that uses the AAC file format for TTS)
- Login not working
- We pull this directly from streamlabs.com, so the protocol is whatever is implemented there. Login failures depend entirely on the web flow, or even on the auth providers (such as Twitch or YouTube).
- Dashboard functions not working
- Also pulled in from the web, functioning inside a browser window
- Getting "Please try again" error in iOS
- Please try updating to iOS 12.1+
- Also try closing any other running apps