Project author: infobip

Project description: Infobip RTC SDK for iOS
Project URL: git://github.com/infobip/infobip-rtc-ios.git
Created: 2019-05-08T16:49:56Z
Project community: https://github.com/infobip/infobip-rtc-ios


Introduction

Infobip RTC is an iOS SDK that lets you take advantage of the Infobip platform, giving you the ability to enrich your
applications with real-time communications in minimal time, while you focus on your application’s user experience and
business logic. We currently support WebRTC calls between two web or app users, phone calls between a web or app user
and a user behind the called phone number, Viber calls, calls to the Infobip Conversations platform, as well as room
calls that multiple participants can join.

Here you will find an overview and a quick guide on how to connect to the Infobip platform. There is also in-depth
reference documentation available here.

First-time setup

In order to use Infobip RTC, you need to have Web and In-app Calls enabled on your account, and that’s it! You are ready
to make Web and In-app calls. To learn how to enable them,
see the documentation.

System Requirements

The Infobip RTC iOS SDK is supported on iOS 10.0 or above.

The supported Swift version is 5.1 or above.

Getting the SDK

There are several ways to install our SDK. We publish it on CocoaPods, Swift Package Manager and Carthage.

CocoaPods

If you want to add it as a CocoaPods dependency, add the following to your Podfile:

  pod 'InfobipRTC'

To install newly added dependencies, simply run pod install.

Swift Package Manager

If you want to install our SDK using Swift Package Manager, add the GitHub repository
https://github.com/infobip/infobip-rtc-ios/ as a Swift Package.
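
If your project itself is described by a Package.swift manifest, the dependency can also be declared there. The snippet
below is a minimal sketch; the swift-tools version, the product name InfobipRTC, and the 2.0.0 version requirement are
assumptions (the product name mirrors the import statement used later, and the version mirrors the Carthage example
below), so adjust them to the release you are targeting:

  // swift-tools-version:5.3
  import PackageDescription

  let package = Package(
      name: "MyApp",
      platforms: [
          .iOS(.v10)
      ],
      dependencies: [
          // Assumed version requirement; check the GitHub releases for the latest one.
          .package(url: "https://github.com/infobip/infobip-rtc-ios/", from: "2.0.0")
      ],
      targets: [
          .target(
              name: "MyApp",
              dependencies: [
                  .product(name: "InfobipRTC", package: "infobip-rtc-ios")
              ]
          )
      ]
  )

For a regular iOS app target, adding the repository URL through Xcode’s Add Package Dependency dialog achieves the
same result.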

Carthage

If you want to use Carthage dependency manager, add these dependencies to your Cartfile:

  github "infobip/infobip-rtc-ios" ~> 2.0.0
  binary "https://rtc.cdn.infobip.com/webrtc/ios/releases.json" >= 1.0.37785

When using it for the first time, run carthage bootstrap --use-xcframeworks. Otherwise,
run carthage update --use-xcframeworks to update dependencies.

Find InfobipRTC.xcframework in the Carthage/Build folder and drag and drop it into
the Frameworks, Libraries, and Embedded Content section of your application target’s General settings.

Using the SDK

Once the SDK is installed, it is available for use in your project as:

  import InfobipRTC

Authentication

Since Infobip RTC is an SDK, you develop your own application and use Infobip RTC only as a dependency. We will refer
to your application users as subscribers throughout this guide. In order to use Infobip RTC, you need to register your
subscribers on our platform. The credentials your subscribers use to connect to your application are irrelevant to
Infobip. We only need the identity they will use to present themselves on our platform. Once we have the subscriber’s
identity, we can generate a token assigned to that specific subscriber. Using that token, your subscribers are able to
connect to our platform (using the Infobip RTC SDK).

To generate these tokens for your subscribers, you need to call our
/webrtc/1/token HTTP API
endpoint with the proper parameters. After you have successfully authenticated your subscribers, we can relate their
token to your application. Typically, generating a token occurs after your subscribers are authenticated inside your
application. The token you receive in the response is then used to make and receive calls via the InfobipRTC client in
your mobile application.
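
The later examples in this guide call an obtainToken() helper. The snippet below is a minimal sketch of what such a
helper could look like; the backend endpoint (https://example.com/rtc/token), the response shape, and the TokenResponse
type are all hypothetical. In practice, your own backend would call the /webrtc/1/token endpoint on your behalf and
return the resulting token to the app:

  import Foundation

  // Hypothetical response shape returned by your backend.
  struct TokenResponse: Decodable {
      let token: String
  }

  // Fetches a WebRTC token from your own backend, which in turn calls
  // Infobip's /webrtc/1/token endpoint for the authenticated subscriber.
  func obtainToken(completion: @escaping (String?) -> Void) {
      var request = URLRequest(url: URL(string: "https://example.com/rtc/token")!)
      request.httpMethod = "POST"
      URLSession.shared.dataTask(with: request) { data, _, error in
          guard let data = data, error == nil,
                let response = try? JSONDecoder().decode(TokenResponse.self, from: data) else {
              completion(nil)
              return
          }
          completion(response.token)
      }.resume()
  }

Note that the guide’s snippets call obtainToken() synchronously for brevity; in a real application you would fetch the
token asynchronously, as sketched above, before creating call requests.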

Application Permissions

In order to use Infobip RTC in your application, the following permissions need to be granted:

Record permission

Before making any call, make sure to request permission to record audio. Note that your app’s Info.plist must contain
the NSMicrophoneUsageDescription key for the microphone permission prompt to appear.

Example:

  AVAudioSession.sharedInstance().requestRecordPermission { granted in
      if granted {
          // The user granted access. Present recording interface.
      }
  }

Please check
the official documentation
for additional details.

Camera permission

Make sure the camera permission is requested before making or switching to a video call. Your app’s Info.plist must
also contain the NSCameraUsageDescription key.

Example:

  AVCaptureDevice.requestAccess(for: AVMediaType.video) { granted in
      if granted {
          // The user granted access. Present camera interface.
      }
  }

Please check
the official documentation
for additional details.

Before you make and receive calls

Keep in mind that making and receiving calls on iOS requires you to
use CallKit.
This enables you to display the system-calling UI and coordinate your calling services with other apps and the system.
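
A minimal sketch of how an incoming call could be reported to CallKit is shown below, using a helper class of your own
(the provider name and handle value are placeholders). Wiring the CXProviderDelegate actions (answer, end, mute) to the
corresponding InfobipRTC calls is omitted for brevity:

  import CallKit

  class CallKitReporter: NSObject, CXProviderDelegate {
      private let provider: CXProvider

      override init() {
          let configuration = CXProviderConfiguration(localizedName: "MyApp")
          configuration.supportsVideo = true
          provider = CXProvider(configuration: configuration)
          super.init()
          provider.setDelegate(self, queue: nil)
      }

      // Shows the native incoming-call screen for the given caller identity.
      func reportIncomingCall(from caller: String, completion: @escaping (Error?) -> Void) {
          let update = CXCallUpdate()
          update.remoteHandle = CXHandle(type: .generic, value: caller)
          provider.reportNewIncomingCall(with: UUID(), update: update, completion: completion)
      }

      // Called when the system resets the provider; end any active calls here.
      func providerDidReset(_ provider: CXProvider) {
      }
  }

You would typically report the call from the incoming-call handler shown later in this guide, and answer or hang up the
InfobipRTC call from the corresponding CXProviderDelegate callbacks.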

Getting an InfobipRTC instance

To utilize all the functionalities of the InfobipRTC client, you need to obtain an instance of InfobipRTC.
This is done by calling the globally exposed
function getInfobipRTCInstance:

  let infobipRTC = getInfobipRTCInstance()

Making a WebRTC call

You can call another subscriber if you know their identity. This is done via
the callWebrtc method:

  let token = obtainToken()
  let infobipRTC = getInfobipRTCInstance()
  let callWebrtcRequest = CallWebrtcRequest(token, destination: "Alice", webrtcCallEventListener: self)
  let webrtcCall = infobipRTC.callWebrtc(callWebrtcRequest)

As you can see, the callWebrtc method
returns an instance of WebrtcCall as a result. With it,
you can track the status of your call and respond to events, such as:

  • called subscriber answered the call
  • called subscriber rejected the call
  • the call has ended

The WebrtcCallEventListener, passed as the third parameter, is used for receiving events from the SDK, and can be set
up using the following code:

  class RTCWebrtcCallEventListener: WebrtcCallEventListener {
      func onRinging(_ callRingingEvent: CallRingingEvent) {
          os_log("Call is ringing.")
      }
      func onEarlyMedia(_ callEarlyMediaEvent: CallEarlyMediaEvent) {
          os_log("Received early media.")
      }
      func onEstablished(_ callEstablishedEvent: CallEstablishedEvent) {
          os_log("Call established.")
      }
      func onCameraVideoAdded(_ cameraVideoAddedEvent: CameraVideoAddedEvent) {
          os_log("Camera video added.")
      }
      func onCameraVideoUpdated(_ cameraVideoUpdatedEvent: CameraVideoUpdatedEvent) {
          os_log("Camera video updated.")
      }
      func onCameraVideoRemoved() {
          os_log("Camera video removed.")
      }
      func onScreenShareAdded(_ screenShareAddedEvent: ScreenShareAddedEvent) {
          os_log("Screen share added.")
      }
      func onScreenShareRemoved(_ screenShareRemovedEvent: ScreenShareRemovedEvent) {
          os_log("Screen share removed.")
      }
      func onRemoteCameraVideoAdded(_ cameraVideoAddedEvent: CameraVideoAddedEvent) {
          os_log("Remote camera video added.")
      }
      func onRemoteCameraVideoRemoved() {
          os_log("Remote camera video removed.")
      }
      func onRemoteScreenShareAdded(_ screenShareAddedEvent: ScreenShareAddedEvent) {
          os_log("Remote screen share added.")
      }
      func onRemoteScreenShareRemoved() {
          os_log("Remote screen share removed.")
      }
      func onRemoteMuted() {
          os_log("Remote endpoint muted.")
      }
      func onRemoteUnmuted() {
          os_log("Remote endpoint unmuted.")
      }
      func onHangup(_ callHangupEvent: CallHangupEvent) {
          os_log("Call ended.")
      }
      func onError(_ errorEvent: ErrorEvent) {
          os_log("An error has occurred.")
      }
      func onReconnecting(_ callReconnectingEvent: CallReconnectingEvent) {
          os_log("Call reconnecting.")
      }
      func onReconnected(_ callReconnectedEvent: CallReconnectedEvent) {
          os_log("Call reconnected.")
      }
      func onRemoteDisconnected(_ remoteDisconnectedEvent: RemoteDisconnectedEvent) {
          os_log("Remote endpoint disconnected.")
      }
      func onRemoteReconnected(_ remoteReconnectedEvent: RemoteReconnectedEvent) {
          os_log("Remote endpoint reconnected.")
      }
  }

When the WebrtcCallEventListener is set up and the call is established, there are a few things you can do with the
actual call. One of them is to hang up the call, which can be done via
the hangup method. Upon completion, both endpoints will
receive the CallHangupEvent.

  webrtcCall.hangup()

You can simulate a digit press during the call by sending DTMF (Dual-Tone Multi-Frequency) codes. This is achieved via
the sendDTMF method. Valid DTMF codes are the digits 0-9, the letters A to D, and the symbols * and #.

  webrtcCall.sendDTMF("*")

During the call, you can also mute (and unmute) your audio:

  webrtcCall.mute(true)

Or you can play media on the speakerphone:

  webrtcCall.speakerphone(true)

To have better control over all connected audio devices, such as Bluetooth headsets, check out
our audio device manager.

Also, you can check the call status:

  let status = webrtcCall.status

Making a phone call

Making a phone call is similar to calling a regular WebRTC user; you just use
the callPhone method instead
of callWebrtc. This method accepts an
optional second parameter where you define the from value, which will be displayed on the called phone as the
Caller ID. The result of callPhone is an
instance of PhoneCall on which you can perform several
actions, such as muting the call, hanging it up, checking its start time, answer time, duration, and more.

  let token = obtainToken()
  let infobipRTC = getInfobipRTCInstance()
  let callPhoneRequest = CallPhoneRequest(token, destination: "41793026727", phoneCallEventListener: self)
  let phoneCallOptions = PhoneCallOptions(from: "33755531044")
  let phoneCall = infobipRTC.callPhone(callPhoneRequest, phoneCallOptions)

Making a Viber call

Using the callViber method is similar to
the previously described methods. In this case, the call’s destination is the Viber application. Unlike in
the callPhone method, from is required and
is passed as part of the CallViberRequest.
Additionally, it has to be a Viber Voice number. The result of
callViber is an instance of
ViberCall on which you can perform several actions, such as
muting the call, hanging it up, checking its start time, answer time, duration, and more.

  let token = obtainToken()
  let infobipRTC = getInfobipRTCInstance()
  let callViberRequest = CallViberRequest(token, destination: "41793026727", from: "41727620397", viberCallEventListener: self)
  let viberCall = infobipRTC.callViber(callViberRequest)

Receiving a WebRTC call

Note: In order for push notifications to work, they have to be enabled for your application, as explained in
the documentation.

In order to be able to receive incoming WebRTC calls, your application needs to support several things:

  • VoIP Background mode enabled - Xcode Project > Capabilities > Background Modes, making sure the following
    options are checked:
    • Voice over IP
    • Background fetch
    • Remote notifications
  • Push Notifications enabled - Xcode Project > Capabilities > Push Notifications
  • VoIP Services Certificate - Log into your Apple developer account, find your app under the Identifiers option, enable
    Push Notifications and generate a new certificate following the instructions from Apple. Go back to your Mac,
    import the generated certificate into your Keychain, and then export it as a .p12 file, which will be used later to
    send push notifications.

Once the configuration is done, your application must register for push notifications, and you have to set up the
PKPushRegistryDelegate and IncomingCallEventListener using the following code:

  class MainController: NSObject, PKPushRegistryDelegate, IncomingCallEventListener {
      private var voipRegistry: PKPushRegistry

      override init() {
          voipRegistry = PKPushRegistry(queue: DispatchQueue.main)
          super.init()
          voipRegistry.desiredPushTypes = [PKPushType.voIP]
          voipRegistry.delegate = self
      }

      var infobipRTC: InfobipRTC {
          get {
              return getInfobipRTCInstance()
          }
      }

      func pushRegistry(_ registry: PKPushRegistry, didUpdate pushCredentials: PKPushCredentials, for type: PKPushType) {
          if type == .voIP {
              let token = obtainToken()
              let debug = isDebug()
              infobipRTC.enablePushNotification(token, pushCredentials: pushCredentials, debug: debug, pushConfigId: "454d142b-a1ad-239a-d231-227fa335aadc3")
          }
      }

      func pushRegistry(_ registry: PKPushRegistry, didReceiveIncomingPushWith payload: PKPushPayload, for type: PKPushType) {
          if type == .voIP {
              os_log("Received VoIP Push Notification %@", payload)
              if infobipRTC.isIncomingBasicCall(payload) {
                  infobipRTC.handleIncomingCall(payload, self)
              }
          }
      }

      func pushRegistry(_ registry: PKPushRegistry, didInvalidatePushTokenFor type: PKPushType) {
          let token = obtainToken()
          infobipRTC.disablePushNotification(token)
      }

      func onIncomingWebrtcCall(_ incomingWebrtcCallEvent: IncomingWebrtcCallEvent) {
          let incomingWebrtcCall = incomingWebrtcCallEvent.incomingWebrtcCall
          // Don't forget to register this call to CallKit
          incomingWebrtcCall.webrtcCallEventListener = WebrtcCallListener(incomingWebrtcCall)
          incomingWebrtcCall.accept() // or incomingWebrtcCall.decline()
      }

      func isDebug() -> Bool {
          #if DEBUG
          return true
          #else
          return false
          #endif
      }
  }

  class WebrtcCallListener: WebrtcCallEventListener {
      let webrtcCall: WebrtcCall

      init(_ webrtcCall: WebrtcCall) {
          self.webrtcCall = webrtcCall
      }

      ...
  }

Receiving a WebRTC call on Simulator

Since push notifications are not available on simulator devices, in order to test incoming calls you can create
InfobipSimulator instance when creating Push Registry:

  1. let token = obtainToken()
  2. var pushRegistry = InfobipSimulator(token: token)

Joining a room call

You can join a room call with other WebRTC endpoints. The room call will start as soon as at least one participant
joins.

A room can be joined by up to 15 participants simultaneously.

Joining the room is done via the joinRoom
method:

  let token = obtainToken()
  let infobipRTC = getInfobipRTCInstance()
  let roomCallRequest = RoomCallRequest(token, roomName: "room-demo", roomCallEventListener: self)
  let room = infobipRTC.joinRoom(roomCallRequest)

As you can see, the joinRoom method returns an
instance of RoomCall as a result. With it, you can track
the status of your room call and respond to events, such as:

  • another participant joined the room
  • a participant left the room
  • a participant muted or unmuted themselves

The RoomCallEventListener, passed as the third parameter, is used for receiving events from the SDK, and can be set up
using the following code:

  class RTCRoomCallEventListener: RoomCallEventListener {
      func onError(_ errorEvent: ErrorEvent) {
          os_log("An error has occurred.")
      }
      func onRoomJoined(_ roomJoinedEvent: RoomJoinedEvent) {
          os_log("You have joined the room.")
      }
      func onRoomLeft(_ roomLeftEvent: RoomLeftEvent) {
          os_log("You have left the room.")
      }
      func onParticipantJoining(_ participantJoiningEvent: ParticipantJoiningEvent) {
          os_log("Participant joining the room.")
      }
      func onParticipantJoined(_ participantJoinedEvent: ParticipantJoinedEvent) {
          os_log("Participant joined the room.")
      }
      func onParticipantLeft(_ participantLeftEvent: ParticipantLeftEvent) {
          os_log("Participant left the room.")
      }
      func onParticipantMuted(_ participantMutedEvent: ParticipantMutedEvent) {
          os_log("Participant muted themself.")
      }
      func onParticipantUnmuted(_ participantUnmutedEvent: ParticipantUnmutedEvent) {
          os_log("Participant unmuted themself.")
      }
      func onParticipantDeaf(_ participantDeafEvent: ParticipantDeafEvent) {
          os_log("Participant deafened themself.")
      }
      func onParticipantUndeaf(_ participantUndeafEvent: ParticipantUndeafEvent) {
          os_log("Participant undeafened themself.")
      }
      func onParticipantStartedTalking(_ participantStartedTalkingEvent: ParticipantStartedTalkingEvent) {
          os_log("Participant started talking.")
      }
      func onParticipantStoppedTalking(_ participantStoppedTalkingEvent: ParticipantStoppedTalkingEvent) {
          os_log("Participant stopped talking.")
      }
      func onCameraVideoAdded(_ cameraVideoAddedEvent: CameraVideoAddedEvent) {
          os_log("Camera video added.")
      }
      func onCameraVideoUpdated(_ cameraVideoUpdatedEvent: CameraVideoUpdatedEvent) {
          os_log("Camera video updated.")
      }
      func onCameraVideoRemoved() {
          os_log("Camera video removed.")
      }
      func onScreenShareAdded(_ screenShareAddedEvent: ScreenShareAddedEvent) {
          os_log("Screen share started.")
      }
      func onScreenShareRemoved(_ screenShareRemovedEvent: ScreenShareRemovedEvent) {
          os_log("Screen share stopped.")
      }
      func onParticipantCameraVideoAdded(_ participantCameraVideoAddedEvent: ParticipantCameraVideoAddedEvent) {
          os_log("Participant added camera video.")
      }
      func onParticipantCameraVideoRemoved(_ participantCameraVideoRemovedEvent: ParticipantCameraVideoRemovedEvent) {
          os_log("Participant removed camera video.")
      }
      func onParticipantScreenShareAdded(_ participantScreenShareAddedEvent: ParticipantScreenShareAddedEvent) {
          os_log("Participant started screen share.")
      }
      func onParticipantScreenShareRemoved(_ participantScreenShareRemovedEvent: ParticipantScreenShareRemovedEvent) {
          os_log("Participant stopped screen share.")
      }
      func onParticipantDisconnected(_ participantDisconnectedEvent: ParticipantDisconnectedEvent) {
          os_log("Participant disconnected.")
      }
      func onParticipantReconnected(_ participantReconnectedEvent: ParticipantReconnectedEvent) {
          os_log("Participant reconnected.")
      }
      func onReconnecting(_ callReconnectingEvent: CallReconnectingEvent) {
          os_log("Room call reconnecting.")
      }
      func onReconnected(_ callReconnectedEvent: CallReconnectedEvent) {
          os_log("Room call reconnected.")
      }
      func onRoomRecordingStarted(_ roomRecordingStartedEvent: RoomRecordingStartedEvent) {
          os_log("Room recording started.")
      }
  }

When the RoomCallEventListener is set up and you have joined the room, there are a few things you can do with the
actual room call.

One of them is to leave, which can be done via
the leave method. Upon completion, the
onParticipantLeft method
will be triggered for the remaining participants in the room call, while for you, the
onRoomLeft method will be
triggered.

  roomCall.leave()

During the room call, you can also mute and unmute your audio by calling
the mute method. Upon completion, the
onParticipantMuted
/ onParticipantUnmuted
method will be triggered for the other participants in the room call.

  roomCall.mute(true)

To check if the audio is muted, call the muted
method in the following way:

  let audioMuted = roomCall.muted()

Also, you can enable and disable your camera video by calling
the cameraVideo method. Upon completion, the
onParticipantCameraVideoAdded
/ onParticipantCameraVideoRemoved
method will be triggered for the other participants in the room call, while for you, the
onCameraVideoAdded
/ onCameraVideoRemoved
method will be triggered.

  roomCall.cameraVideo(cameraVideo: true)

You can start and stop sharing your screen by calling
the screenShare method. Upon completion, the
onParticipantScreenShareAdded
/ onParticipantScreenShareRemoved
method will be triggered for the other participants in the room call, while for you, the
onScreenShareAdded
/ onScreenShareRemoved
method will be triggered.

  roomCall.screenShare(screenShare: true)