Build a streaming server on your iOS app

Introduction

This is an article from Second Dwango Advent Calendar 2020 Day 20.

At Dwango, I mainly work on iOS app development.

This has nothing to do with our business, but I wanted to verify whether a streaming server can run inside an iOS app using the APIs added to AVAssetWriter in iOS 14 and macOS 11.0, so this article describes that experiment.

The code shown here is only an example, so please fill in the gaps as needed.

Environment used for verification

macOS 10.15.5
Xcode 12.2
Safari 13.1.1
iPhone 12 Pro iOS 14.2

Output HLS segments and an index file from the camera image

Acquiring the camera image

Get CMSampleBuffer from AVCaptureDevice.

Since this is a common process, I will omit the full code; it is enough to receive CMSampleBuffer via AVCaptureVideoDataOutputSampleBufferDelegate.

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    ....
}

Also, to keep things simple, this article handles video only; audio is omitted.
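For completeness, a minimal capture setup might look like the following sketch. CaptureController and the queue label are names I chose; camera permission handling and most error handling are omitted:

```swift
import AVFoundation

// Minimal capture setup. Device selection, permissions, and error
// handling are simplified for illustration.
final class CaptureController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let outputQueue = DispatchQueue(label: "video.output.queue")

    func start() throws {
        session.beginConfiguration()
        // Pick the default video device; real code should handle nil.
        guard let device = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: device))
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: outputQueue)
        session.addOutput(output)
        session.commitConfiguration()
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Hand sampleBuffer to the AVAssetWriter pipeline described next.
    }
}
```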

AVAssetWriter

Initialize AVAssetWriter. Specify .mpeg4AppleHLS for outputFileTypeProfile.

self.writer = AVAssetWriter(contentType: UTType(AVFileType.mp4.rawValue)!)
writer.delegate = self
writer.outputFileTypeProfile = .mpeg4AppleHLS
writer.preferredOutputSegmentInterval = CMTime(seconds: 1.0, preferredTimescale: 1)
writer.initialSegmentStartTime = CMTime.zero
let videoOutputSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 360,
    AVVideoHeightKey: 640
]
self.videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
videoInput.expectsMediaDataInRealTime = true
writer.add(videoInput)

Next, handle the received CMSampleBuffer. On the first buffer, put the writer into the writing state and start a session. While writing, correct the presentation timestamp (PTS) so the stream starts at zero, then append the CMSampleBuffer to AVAssetWriterInput.

if writer.status == .unknown {
    writer.startWriting()
    writer.startSession(atSourceTime: CMTime.zero)
}
if writer.status == .writing {
    if let offset = offset {
        var copyBuffer: CMSampleBuffer?
        var count: CMItemCount = 1
        var info = CMSampleTimingInfo()
        CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, entryCount: count, arrayToFill: &info, entriesNeededOut: &count)
        info.presentationTimeStamp = CMTimeSubtract(info.presentationTimeStamp, offset)
        CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault,
                                              sampleBuffer: sampleBuffer,
                                              sampleTimingEntryCount: 1,
                                              sampleTimingArray: &info,
                                              sampleBufferOut: &copyBuffer)
        if let copyBuffer = copyBuffer, videoInput.isReadyForMoreMediaData {
            videoInput.append(copyBuffer)
        }
    } else {
        offset = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    }
}

After a while, AVAssetWriterDelegate's assetWriter(_:didOutputSegmentData:segmentType:segmentReport:) is called and the segment data is passed in.

func assetWriter(_ writer: AVAssetWriter, didOutputSegmentData segmentData: Data, segmentType: AVAssetSegmentType, segmentReport: AVAssetSegmentReport?) {
    ...
}

Creating segment and index files

Next, create segment files and an index file from the received segment data. Here, create a new directory under the Documents directory and save both the segments and the index file there.

For the segment files, simply write the segmentData received in didOutputSegmentData to disk as-is. Here I numbered them in the order they arrived and saved them under names such as segment1.m4s.
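As a sketch (SegmentStore, the directory name, and the counter are my own choices), saving each segment might look like this:

```swift
import Foundation

// Persists HLS segments under a directory, numbering them in
// arrival order (segment1.m4s, segment2.m4s, ...).
final class SegmentStore {
    private var segmentIndex = 0
    let directory: URL

    init(base: URL = FileManager.default.urls(for: .documentDirectory,
                                              in: .userDomainMask)[0],
         directoryName: String = "HLS") throws {
        directory = base.appendingPathComponent(directoryName, isDirectory: true)
        try FileManager.default.createDirectory(at: directory,
                                                withIntermediateDirectories: true)
    }

    /// Writes one segment to disk and returns its file name.
    func save(segmentData: Data) throws -> String {
        segmentIndex += 1
        let name = "segment\(segmentIndex).m4s"
        try segmentData.write(to: directory.appendingPathComponent(name))
        return name
    }
}
```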

Since the index file needs some segments to reference, create or update it once a few segments exist. The index file must comply with the HLS specification, but generating it by following Apple's sample code works without problems.

The index file is updated each time a segment is created and passed.
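The playlist itself is plain text. A minimal builder, modeled on the format in Apple's sample code, could look like the sketch below. Note that with .mpeg4AppleHLS the first data delivered by the delegate has segmentType == .initialization; that initialization segment is referenced via EXT-X-MAP rather than EXTINF. The version number and the fixed 1-second duration (matching preferredOutputSegmentInterval above) are assumptions for this example:

```swift
import Foundation

/// Builds an HLS media playlist for fMP4 segments.
/// `initializationName` is the initialization segment that the
/// AVAssetWriterDelegate delivers first.
func makePlaylist(initializationName: String,
                  segmentNames: [String],
                  targetDuration: Double = 1.0) -> String {
    var lines = [
        "#EXTM3U",
        "#EXT-X-TARGETDURATION:\(Int(targetDuration.rounded(.up)))",
        "#EXT-X-VERSION:7",
        "#EXT-X-MEDIA-SEQUENCE:1",
        "#EXT-X-MAP:URI=\"\(initializationName)\""
    ]
    for name in segmentNames {
        lines.append("#EXTINF:\(String(format: "%1.5f", targetDuration)),")
        lines.append(name)
    }
    return lines.joined(separator: "\n") + "\n"
}
```

Rewriting the whole playlist on every segment is fine at this scale; a sliding-window playlist would additionally advance EXT-X-MEDIA-SEQUENCE as old segments are dropped.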

Set up a server with SwiftNIO Transport Services

swift-nio-transport-services is a SwiftNIO extension, built on Network.framework, that can be used on iOS, watchOS, and tvOS.

Applications written with SwiftNIO will work with SwiftNIO Transport Services after only minor changes.

In addition to providing first-class support for Apple platforms, NIO Transport Services takes advantage of the richer API of Network.framework to provide more insight into the behaviour of the network than is normally available to NIO applications. This includes the ability to wait for connectivity until a network route is available, as well as all of the extra proxy and VPN support that is built directly into Network.framework.

All regular NIO applications should work just fine with NIO Transport Services, simply by changing the event loops and bootstraps in use.

(From GitHub README)

SwiftNIO Transport Services supports CocoaPods, but it can also be installed with Swift Package Manager, which is what I used here.

In Swift Package Manager, add the SwiftNIO and SwiftNIO Transport Services packages, and add NIO, NIOHTTP1, and NIOTransportServices to your target's dependencies.

After adding them, implement the HTTP server by referring to the SwiftNIO Transport Services sample.

let group = NIOTSEventLoopGroup()
let channel = try! NIOTSListenerBootstrap(group: group)
    .childChannelInitializer { channel in
        channel.pipeline.configureHTTPServerPipeline(withPipeliningAssistance: true, withErrorHandling: true).flatMap {
            channel.pipeline.addHandler(HTTP1ServerHandler())
        }
    }
    .bind(host: "0.0.0.0", port: 8080)
    .wait()
try! channel.closeFuture.wait()

HTTP1ServerHandler conforms to ChannelInboundHandler and is implemented as follows.

final class HTTP1ServerHandler: ChannelInboundHandler {

    typealias InboundIn = HTTPServerRequestPart
    typealias OutboundOut = HTTPServerResponsePart

    func channelRead(context: ChannelHandlerContext, data: NIOAny) {
        let part = unwrapInboundIn(data)
        guard case .head(let headData) = part else {
            return
        }
        if headData.uri == "/" {
            // Return index.html as the response
        }
    }
}

Here, a request for / returns HTML containing the player. Add the file to the app bundle as index.html.

<!DOCTYPE html>
<html lang="ja">
  <head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <title>HLS Stream Server</title>
  </head>
  <body>
    <header>
      <h1>HLS Stream Server</h1>
    </header>
    <div>
      <video width="360" height="640" src="index.m3u8" preload="none" onclick="this.play()" controls></video>
    </div>
  </body>
</html>

To return a response, implement and call the following method.

private func handleIndexPageRequest(context: ChannelHandlerContext, data: NIOAny) {
    do {
        let path = Bundle.main.path(forResource: "index", ofType: "html")!
        let data = try Data(contentsOf: URL(fileURLWithPath: path))
        let buffer = context.channel.allocator.buffer(data: data)
        var responseHeaders = HTTPHeaders()
        responseHeaders.add(name: "Content-Length", value: "\(data.count)")
        responseHeaders.add(name: "Content-Type", value: "text/html; charset=utf-8")
        let responseHead = HTTPResponseHead(version: .init(major: 1, minor: 1), status: .ok, headers: responseHeaders)
        context.write(wrapOutboundOut(.head(responseHead)), promise: nil)
        context.write(wrapOutboundOut(.body(.byteBuffer(buffer))), promise: nil)
        context.writeAndFlush(wrapOutboundOut(.end(nil)), promise: nil)
    } catch {
        let responseHead = HTTPResponseHead(version: .init(major: 1, minor: 1), status: .notFound)
        context.write(wrapOutboundOut(.head(responseHead)), promise: nil)
        context.writeAndFlush(wrapOutboundOut(.end(nil)), promise: nil)
    }
}

Implement the same kind of processing for requests for the index file and the segment files.
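Serving the playlist and segments follows the same pattern as handleIndexPageRequest; the main difference is the Content-Type header. A small helper such as the following could pick it from the path (the MIME types are the standard ones for HLS playlists and fMP4 segments):

```swift
import Foundation

/// Returns the Content-Type for the paths this server handles.
/// Unknown paths fall back to application/octet-stream.
func contentType(forPath path: String) -> String {
    if path.hasSuffix(".m3u8") { return "application/vnd.apple.mpegurl" }
    if path.hasSuffix(".m4s")  { return "video/iso.segment" }
    if path.hasSuffix(".html") { return "text/html; charset=utf-8" }
    return "application/octet-stream"
}
```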

Finally, start the HTTP server, the camera capture, and the segment generation process so that they run at the same time.
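One simple arrangement is to run the blocking server loop off the main thread; startServer and startCapture below are placeholder names for the pieces built above:

```swift
import Dispatch

// Placeholder for the NIOTSListenerBootstrap setup, which blocks on
// channel.closeFuture.wait() in the real app.
func startServer() {
    // try! channel.closeFuture.wait()
}

// Placeholder for AVCaptureSession.startRunning() plus writer setup.
func startCapture() {
}

// The server call blocks until the channel closes, so dispatch it to
// a background queue and start capture on the current thread.
DispatchQueue.global(qos: .userInitiated).async {
    startServer()
}
startCapture()
```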

Verification

From Safari, access the iPhone's local IP address and port. Checking the network tab in the debug menu shows the index file being updated and the segment files being fetched.


References

Writing fragmented MPEG-4 content with AVAssetWriter (Apple sample code)
Live Playlist (Sliding Window) Construction
[iOS] I made a Vine-style append-recording app with AVFoundation (AVCaptureVideoDataOutput/AVCaptureAudioDataOutput)
https://swiftreviewercom.wordpress.com/2020/03/27/import-swift-nio-to-ios-tvos-in-xcode-11/
https://www.process-one.net/blog/swiftnio-introduction-to-channels-channelhandlers-and-pipelines/
