TikTok is one of the world’s most popular social media apps. It lets users create, share, and watch short-form videos, and listen to bite-sized music clips from various genres. Its vibrant community produces trending and viral videos, and users engage with that content through likes and comments.
In this tutorial, we will build our very own TikTok clone using SwiftUI and Stream. By the end, you will have learned everything from creating the application structure and implementing the home feed video player to integrating real-time video capabilities with Stream Video, so users can start and record livestreams for others in the app.
Project Requirements
To follow along with this tutorial, you will need Xcode installed; Xcode 15 or later is recommended. To access the user's camera feed in this app, we will use the Stream Video SDK, which allows developers to build FaceTime-style video calling, Twitch-like content streaming, Zoom-like video conferencing, and audio room platforms with reusable components. After creating a new SwiftUI project, you can fetch and install Stream Video using Swift Package Manager in Xcode.
Explore the Sample SwiftUI TikTok Clone App
Before you follow the steps in this tutorial, it is worth getting a look and feel for how the finished project works. You can get the final sample app from GitHub, open it in Xcode, explore the code structure, and see how it fits together. The final project already has microphone and camera usage permissions configured. The following section explains how to set those configurations in Xcode.
Create and Configure the SwiftUI Project
Create a new SwiftUI project in Xcode and name it as you like. The sample project in this tutorial uses TikTokCloneSwiftUI as the app name. The app relies on audio from the user's device and live video capture from the camera, and access to these capabilities requires camera and microphone usage descriptions in Xcode. Select the app's root folder in the Xcode Project Navigator and click the Info tab. Then, click any of the + buttons that appear when hovering over the items under Key. Search the Privacy category and add the two entries below (the resulting raw Info.plist keys are shown after the list):
- Privacy - Microphone Usage Description.
- Privacy - Camera Usage Description.
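If you prefer to edit the property list source directly (right-click Info.plist, then Open As -> Source Code), the two privacy entries correspond to the raw keys below. The usage description strings are placeholders you can adjust to suit your app:

```xml
<!-- Raw Info.plist keys behind the two privacy entries; the usage strings below are placeholders -->
<key>NSCameraUsageDescription</key>
<string>This app needs camera access to capture and stream live video.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app needs microphone access to capture audio for live video.</string>
```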
Build the TikTok-Like UIs
The app we are building in this tutorial allows short-form video browsing and self-recording using the iOS device's camera. The primary interaction styles are snap-flicking (paging) to watch bite-sized videos and tapping to record a live video, as demonstrated in the image below.
When the app launches, it displays the video feed as a TabView with a horizontal paging scroll style. This is the opposite of the original TikTok app, which snap-scrolls vertically. The video feed uses standard SwiftUI views built on AVKit. In a later article, we will integrate this feature with Stream's Activity Feeds.
Create the Looping Videos
For demonstration, this project has five looping videos that users can flick through. In a production application, the video feed could be backed by activity feeds instead. Since each looping video uses the same Swift code, the following sample shows just one of them; you will find all five files when you download the Xcode project from GitHub.
```swift
//
//  FirstVideoView.swift
//

import SwiftUI
import AVFoundation
import AVKit

struct FirstVideoView: View {
    @State var player = AVPlayer()
    let avPlayer = AVPlayer(url: Bundle.main.url(forResource: "oneDancing", withExtension: "mp4")!)

    var body: some View {
        ZStack {
            VideoPlayer(player: avPlayer)
                .scaledToFill()
                .ignoresSafeArea()
                .onAppear {
                    avPlayer.play()
                    avPlayer.actionAtItemEnd = .none

                    // Restart the video whenever it reaches the end to create an endless loop
                    NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: avPlayer.currentItem, queue: .main) { _ in
                        avPlayer.seek(to: .zero)
                        avPlayer.play()
                    }
                }
        }
    }
}

#Preview {
    FirstVideoView()
        .preferredColorScheme(.dark)
}
```
The sample code above creates a SwiftUI video player that loops forever using AVKit and AVFoundation.
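The five looping video views are stitched together by the HTabView container referenced later in FeedsView. Its implementation is not shown in this article, but a minimal sketch could look like the following, assuming the remaining video views are named SecondVideoView through FifthVideoView (the actual HTabView in the sample project may differ):

```swift
// A minimal sketch of a horizontal, snap-paging container for the looping videos.
// The view names other than FirstVideoView are assumptions; check the GitHub project for the real HTabView.
import SwiftUI

struct HTabView: View {
    var body: some View {
        TabView {
            FirstVideoView()
            SecondVideoView()
            ThirdVideoView()
            FourthVideoView()
            FifthVideoView()
        }
        // The page-style tab view provides the snap-flicking effect; hide the page dots
        .tabViewStyle(.page(indexDisplayMode: .never))
        .ignoresSafeArea()
    }
}
```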
Create the Video Feed Overlays
As shown in the image above, each looping video has two groups of views overlaid on it. Let's create ReactionButtons1View.swift to display the user's profile, likes, comments, and a share action in a vertical stack. This view provides the icons that appear over every video as you swipe through the feed.
```swift
//
//  ReactionButtonsView.swift
//  TikTokCloneSwiftUI
//

import SwiftUI

struct ReactionButtons1View: View {
    var body: some View {
        VStack(spacing: 24) {
            Button {
            } label: {
                Image(.profile1)
                    .resizable()
                    .scaledToFit()
                    .frame(width: 60, height: 60)
                    .clipShape(Circle())
                    .overlay(
                        Circle()
                            .stroke(LinearGradient(gradient: Gradient(colors: [Color.red, Color.blue]), startPoint: .topLeading, endPoint: .bottomTrailing), lineWidth: 2)
                    )
            }

            Button {
            } label: {
                VStack {
                    Image(systemName: "suit.heart.fill")
                        .font(.title)
                    Text("5K")
                }
                .foregroundStyle(.white)
            }

            Button {
            } label: {
                VStack {
                    Image(systemName: "message.fill")
                        .font(.title)
                    Text("56")
                }
                .foregroundStyle(.white)
            }

            Button {
            } label: {
                VStack {
                    Image(systemName: "square.and.arrow.up.fill")
                        .font(.title)
                    Text("Share")
                }
                .foregroundStyle(.white)
            }
        }
        .padding()
    }
}

#Preview {
    ReactionButtons1View()
        .preferredColorScheme(.dark)
}
```
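In the sample project, these buttons are layered over the feed (you can see a commented-out ReactionButtons1View() inside FeedsView below). As a rough idea, overlaying them on a single looping video could look like the hypothetical snippet below; the container view name is made up for illustration:

```swift
// Hypothetical example: layering the reaction buttons over one looping video.
// The sample project composes these views slightly differently inside FeedsView.
import SwiftUI

struct VideoWithReactionsView: View {
    var body: some View {
        ZStack(alignment: .bottomTrailing) {
            FirstVideoView()          // full-screen looping video
            ReactionButtons1View()    // profile, like, comment, and share icons
        }
    }
}
```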
Create the TikTok-Like Tab Bar
The tab bar contains five tab items. Only one of them is interactive: a plus button that launches the user's live video. To create the UI, add FeedsView.swift to the project and use the sample code below to fill out its content.
```swift
//
//  FeedsView.swift
//  TikTokCloneSwiftUI
//
//  Created by Amos Gyamfi on 31.5.2024.
//

import SwiftUI

struct FeedsView: View {
    @State var top = 0
    @State private var isLocalVideoShowing = false

    var body: some View {
        NavigationStack {
            ZStack {
                HTabView()
                    .padding(.top, -200)

                HStack {
                    Spacer()
                    //ReactionButtons1View()
                }
            }
            .toolbar {
                ToolbarItem(placement: .principal) {
                    HStack {
                        Button {
                            self.top = 0
                        } label: {
                            Text("Following")
                                .fontWeight(self.top == 0 ? .bold : .none)
                                .foregroundStyle(self.top == 0 ? .white : .white.opacity(0.5))
                                .padding(.vertical)
                        }
                        .buttonStyle(.plain)

                        Button {
                            self.top = 1
                        } label: {
                            Text("For You")
                                .fontWeight(self.top == 1 ? .bold : .none)
                                .foregroundStyle(self.top == 1 ? .white : .white.opacity(0.5))
                                .padding(.vertical)
                        }
                        .buttonStyle(.plain)
                    }
                }

                ToolbarItemGroup {
                    Button {
                        //
                    } label: {
                        Image(systemName: "magnifyingglass")
                    }
                    .buttonStyle(.plain)
                }

                ToolbarItemGroup(placement: .bottomBar) {
                    Button {
                    } label: {
                        VStack {
                            Image(systemName: "house.fill")
                            Text("Home")
                                .font(.caption)
                        }
                    }
                    .buttonStyle(.plain)

                    Spacer()

                    Button {
                    } label: {
                        VStack {
                            Image(systemName: "person.2")
                            Text("Friends")
                                .font(.caption)
                        }
                    }
                    .buttonStyle(.plain)

                    Spacer()

                    // The plus button presents the live video screen full screen
                    Button {
                        isLocalVideoShowing.toggle()
                    } label: {
                        Image(systemName: "plus.rectangle.fill")
                    }
                    .font(.title3)
                    .buttonStyle(.plain)
                    .foregroundStyle(.black)
                    .padding(EdgeInsets(top: 0, leading: 2, bottom: 0, trailing: 2))
                    .background(LinearGradient(gradient: Gradient(colors: [.teal, .red]), startPoint: .leading, endPoint: .trailing))
                    .cornerRadius(6)
                    .fullScreenCover(isPresented: $isLocalVideoShowing, content: CreateJoinLiveVideo.init)

                    Spacer()

                    Button {
                    } label: {
                        VStack {
                            Image(systemName: "tray")
                            Text("Inbox")
                                .font(.caption)
                        }
                    }
                    .buttonStyle(.plain)

                    Spacer()

                    Button {
                    } label: {
                        VStack {
                            Image(systemName: "person")
                            Text("Profile")
                                .font(.caption)
                        }
                    }
                    .buttonStyle(.plain)
                }
            }
        }
    }
}

#Preview {
    FeedsView()
        .preferredColorScheme(.dark)
}
```
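To launch the app with this feed as the root screen, the app's entry point can simply present FeedsView. Below is a minimal sketch; the actual entry file in the sample project may be named differently:

```swift
// A minimal, assumed app entry point that shows the feed on launch.
import SwiftUI

@main
struct TikTokCloneSwiftUIApp: App {
    var body: some Scene {
        WindowGroup {
            FeedsView()
                .preferredColorScheme(.dark)
        }
    }
}
```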
Create the Live Video Overlays
The live video overlays consist of a video settings view (vertical container), duration options (horizontal container), and recording UI (pink button), as demonstrated in the image above. Create LiveVideoSettingsView.swift
and substitute its content with the sample code below.
```swift
//
//  LiveVideoSettingsView.swift
//  TikTokCloneSwiftUI
//
//  Created by Amos Gyamfi on 1.6.2024.
//

import SwiftUI

struct LiveVideoSettingsView: View {
    var body: some View {
        VStack(spacing: 24) {
            Button {
                //
            } label: {
                Image(systemName: "bolt.slash.fill")
            }

            Button {
                //
            } label: {
                Image(systemName: "timer")
            }

            Button {
                //
            } label: {
                Image(systemName: "camera.filters")
            }

            Button {
                //
            } label: {
                Image(systemName: "camera.aperture")
            }

            Button {
                //
            } label: {
                Image(systemName: "wand.and.stars")
            }
        }
        .font(.title2)
        .bold()
        .buttonStyle(.plain)
        .padding()
        .background(.quaternary)
        .cornerRadius(32)
    }
}

#Preview {
    LiveVideoSettingsView()
        .preferredColorScheme(.dark)
}
```
Also, create LiveVideoOptionsView.swift
and replace the template code with the following.
```swift
//
//  LiveVideoOptionsView.swift
//  TikTokCloneSwiftUI
//
//  Created by Amos Gyamfi on 1.6.2024.
//

import SwiftUI

struct LiveVideoOptionsView: View {
    var body: some View {
        HStack(spacing: 20) {
            Button {
                //
            } label: {
                Text("10m")
            }

            Button {
                //
            } label: {
                Text("60s")
            }

            Button {
                //
            } label: {
                Text("15s")
            }
            .buttonStyle(.plain)
            .padding(EdgeInsets(top: 4, leading: 8, bottom: 4, trailing: 8))
            .background(.tertiary)
            .cornerRadius(16)

            Button {
                //
            } label: {
                Text("Photo")
            }

            Button {
                //
            } label: {
                Text("Text")
            }
        }
        .buttonStyle(.plain)
        .padding()
        .background(.quaternary)
        .cornerRadius(32)
    }
}

#Preview {
    LiveVideoOptionsView()
        .preferredColorScheme(.dark)
}
```
The sample code below creates the recording view in RecordingView.swift.
```swift
//
//  RecordingView.swift
//  TikTokCloneSwiftUI
//
//  Created by Amos Gyamfi on 2.6.2024.
//

import SwiftUI

struct RecordingView: View {
    var body: some View {
        ZStack {
            Circle()
                .fill(.pink)
                .frame(width: 64, height: 64)
            Circle()
                .stroke(lineWidth: 4)
                .frame(width: 72, height: 72)
        }
    }
}

#Preview {
    RecordingView()
        .preferredColorScheme(.dark)
}
```
Check all the folders in the Xcode Project Navigator for other files related to the app's UI.
Install and Configure the Video SDK
If we were building a production app with Stream's iOS Video SDK, we would need authenticated user credentials from a server. Since the demo app in this tutorial is for development only, we will use hard-coded user credentials from the SDK's video calling tutorial. You can use your Stream account's API key and the companion token generator service to generate random users and tokens for development testing. Check out the getting started guide to learn more.
To access and work with the SDK, we need to install it as a dependency in the Xcode project. Select File -> Add Package Dependencies… and paste this URL, https://github.com/GetStream/stream-video-swift, into the search box to install it.
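If you organize your code as a Swift package and manage dependencies through a Package.swift manifest rather than the Xcode UI, the equivalent declaration would look roughly like the fragment below. The version requirement is an assumption; pin whichever release you are targeting:

```swift
// Assumed Package.swift fragment for pulling in the Stream Video Swift SDK.
dependencies: [
    .package(url: "https://github.com/GetStream/stream-video-swift", from: "1.0.0")
],
targets: [
    .target(
        name: "TikTokCloneSwiftUI",
        dependencies: [
            .product(name: "StreamVideo", package: "stream-video-swift"),
            .product(name: "StreamVideoSwiftUI", package: "stream-video-swift")
        ]
    )
]
```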
Create a Live Video Participant View
For example, when building an iOS video conferencing app with the Video SDK, you need to render both the local and remote participants' videos, plus call controls for muting, flipping between the front and back cameras, and accepting or rejecting calls. None of that is required for our TikTok clone; we only need to show the local participant's live video. Add ParticipantsView.swift
to the project and fill out its content with the code below.
```swift
//
//  ParticipantsView.swift
//  TikTokCloneSwiftUI
//
//  Created by Amos Gyamfi on 1.6.2024.
//

import SwiftUI
import StreamVideo
import StreamVideoSwiftUI

struct ParticipantsView: View {
    var call: Call
    var participants: [CallParticipant]
    var onChangeTrackVisibility: (CallParticipant?, Bool) -> Void

    var body: some View {
        GeometryReader { proxy in
            if !participants.isEmpty {
                ScrollView {
                    LazyVStack {
                        if participants.count == 1, let participant = participants.first {
                            makeCallParticipantView(participant, frame: proxy.frame(in: .global))
                                .frame(width: proxy.size.width, height: proxy.size.height)
                        } else {
                            ForEach(participants) { participant in
                                makeCallParticipantView(participant, frame: proxy.frame(in: .global))
                                    .frame(width: proxy.size.width, height: proxy.size.height / 2)
                            }
                        }
                    }
                }
            } else {
                Color.black
            }
        }
        .edgesIgnoringSafeArea(.all)
    }

    @ViewBuilder
    private func makeCallParticipantView(_ participant: CallParticipant, frame: CGRect) -> some View {
        VideoCallParticipantView(
            participant: participant,
            availableFrame: frame,
            contentMode: .scaleAspectFit,
            customData: [:],
            call: call
        )
        .onAppear { onChangeTrackVisibility(participant, true) }
        .onDisappear { onChangeTrackVisibility(participant, false) }
    }
}

// Floating Participant
struct FloatingParticipantView: View {
    var participant: CallParticipant?
    //var size: CGSize = .init(width: 120, height: 120)
    var size: CGSize = .init(width: UIScreen.main.bounds.width, height: UIScreen.main.bounds.height)

    var body: some View {
        if let participant = participant {
            VStack {
                HStack {
                    Spacer()
                    VideoRendererView(id: participant.id, size: size) { videoRenderer in
                        videoRenderer.handleViewRendering(for: participant, onTrackSizeUpdate: { _, _ in })
                    }
                    .frame(width: size.width, height: size.height)
                    .clipShape(RoundedRectangle(cornerRadius: 8))
                }
                Spacer()
            }
            .padding()
        }
    }
}
```
In summary, the above sample code does the following:
- Imports the required dependencies: The StreamVideo dependency is the core SDK. It does not contain any UI, so it is an excellent choice if you want to build a fully custom TikTok-like experience. Our demo app uses the SDK's reusable UI components by importing StreamVideoSwiftUI.
- Manages the layout with GeometryReader.
- Renders and displays the participants' videos, with FloatingParticipantView showing the local participant full screen.
- Watches for visibility changes of each participant's track.
How to Capture a Device’s Camera Feed
The Video SDK allows developers to access and display local and remote participants' device camera feeds when building a video calling app like WhatsApp. In the context of our TikTok clone, we will use the SDK's VideoRendererView (wrapped in FloatingParticipantView above) to display only the local participant's video. We use it because our app's use case does not require the SDK's CallControls. To render audio/video calling experiences consisting of active, incoming, and outgoing call screens, you should use the SDK's CallContainer instead. Visit our documentation to learn more about CallContainer.
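For reference, a full calling UI built with CallContainer typically follows the pattern from Stream's SwiftUI video calling tutorial. The sketch below is based on that pattern and is not used in this TikTok clone; exact details may vary between SDK versions:

```swift
// Sketch only (not used in this app): rendering full calling screens with CallContainer.
// Follows the pattern from Stream's SwiftUI video tutorial; verify against your SDK version.
import SwiftUI
import StreamVideo
import StreamVideoSwiftUI

struct FullCallView: View {
    @StateObject var viewModel = CallViewModel()

    var body: some View {
        // CallContainer renders the incoming, outgoing, and active call screens for you
        CallContainer(viewFactory: DefaultViewFactory.shared, viewModel: viewModel)
    }
}
```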
To get a live video from the iOS device’s camera feed, add a new Swift file, CreateJoinLiveVideo.swift. Then, replace the template code with the following.
```swift
import SwiftUI
import StreamVideo
import StreamVideoSwiftUI

struct CreateJoinLiveVideo: View {
    @State var call: Call
    @ObservedObject var state: CallState
    @State var callCreated: Bool = false
    @State private var isRecording = false

    private var client: StreamVideo
    private let apiKey: String = "YOUR API KEY" // The API key can be found in the Credentials section
    private let token: String = "YOUR USER TOKEN" // The Token can be found in the Credentials section
    private let userId: String = "Boba_Fett" // The User Id can be found in the Credentials section
    private let callId: String = "YOUR CALL ID" // The CallId can be found in the Credentials section

    @Environment(\.dismiss) private var dismiss

    init() {
        let user = User(
            id: userId,
            name: "Martin", // name and imageURL are used in the UI
            imageURL: .init(string: "https://getstream.io/static/2796a305dd07651fcceb4721a94f4505/a3911/martin-mitrevski.webp")
        )

        // Initialize Stream Video client
        self.client = StreamVideo(
            apiKey: apiKey,
            user: user,
            token: .init(stringLiteral: token)
        )

        // Initialize the call object
        let call = client.call(callType: "default", callId: callId)

        self.call = call
        self.state = call.state
    }

    var body: some View {
        NavigationStack {
            VStack {
                if callCreated {
                    ZStack {
                        ParticipantsView(
                            call: call,
                            participants: call.state.remoteParticipants,
                            onChangeTrackVisibility: changeTrackVisibility(_:isVisible:)
                        )
                        FloatingParticipantView(participant: call.state.localParticipant)

                        VStack {
                            HStack {
                                Spacer()
                                LiveVideoSettingsView()
                            }
                            .padding(.horizontal, 32)

                            HStack {
                                Spacer()
                                EffectsButtonView()
                                Spacer()

                                Button {
                                    isRecording.toggle()
                                    if isRecording {
                                        // Start recording the live video
                                        Task {
                                            try await call.startRecording()
                                        }
                                    } else {
                                        // Stop the ongoing recording
                                        Task {
                                            try await call.stopRecording()
                                        }
                                    }
                                } label: {
                                    RecordingView()
                                }
                                .buttonStyle(.plain)

                                Spacer()
                                UploadButtonView()
                                Spacer()
                            }
                            .padding(.top, 128)
                        }
                    }
                } else {
                    //Text("loading...")
                    ProgressView()
                }
            }
            .onAppear {
                Task {
                    guard callCreated == false else { return }
                    try await call.join(create: true)
                    callCreated = true
                }
            }
            .toolbar {
                ToolbarItem(placement: .topBarLeading) {
                    Button {
                        dismiss()
                    } label: {
                        Image(systemName: "xmark")
                    }
                    .buttonStyle(.plain)
                }

                ToolbarItem(placement: .principal) {
                    Button {
                    } label: {
                        HStack {
                            Image(systemName: "music.quarternote.3")
                            Text("Add sound")
                        }
                        .font(.caption)
                    }
                    .buttonStyle(.plain)
                    .padding(EdgeInsets(top: 8, leading: 10, bottom: 8, trailing: 10))
                    .background(.quaternary)
                    .cornerRadius(8)
                }

                ToolbarItemGroup(placement: .topBarTrailing) {
                    Button {
                    } label: {
                        Image(systemName: "arrow.triangle.2.circlepath")
                    }
                    .buttonStyle(.plain)
                }

                ToolbarItem(placement: .bottomBar) {
                    LiveVideoOptionsView()
                }
            }
        }
    }

    /// Changes the track visibility for a participant (not visible if they go off-screen).
    /// - Parameters:
    ///   - participant: the participant whose track visibility would be changed.
    ///   - isVisible: whether the track should be visible.
    private func changeTrackVisibility(_ participant: CallParticipant?, isVisible: Bool) {
        guard let participant else { return }
        Task {
            await call.changeTrackVisibility(for: participant, isVisible: isVisible)
        }
    }
}
```
To summarize the sample code above: we create an instance of the StreamVideo client and a user with a hard-coded token and API key. If a call has not yet been created, we create and join it to display the live video. The call to call.join(create: true) enables real-time audio and video transmission. With the above sample code, our TikTok clone app is ready. Let’s implement live video recording in the next step.
Add a Recording Functionality
When users start a live video, the SDK provides an easy way to integrate recording, allowing users to record the ongoing audio and video activity. To use the SDK's recording functionality, create a state variable, @State private var isRecording = false, to toggle between the recording and non-recording states. Then, add a button to start and stop recording.
```swift
Button {
    isRecording.toggle()
    if isRecording {
        // Start recording the live video
        Task {
            try await call.startRecording()
        }
    } else {
        // Stop the ongoing recording
        Task {
            try await call.stopRecording()
        }
    }
} label: {
    RecordingView()
}
.buttonStyle(.plain)
```
You can see the recording implementation in CreateJoinLiveVideo.swift
you added in the previous section.
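If you also want the UI to reflect the server-side recording state rather than only the local isRecording flag, the call's state exposes a recordingState you can observe. The snippet below is a rough sketch of that idea and is not part of the sample project; the property and case names are assumptions based on the SDK's state model and may differ between versions:

```swift
// Sketch only: driving a "REC" badge from the call's recording state.
// Assumes call.state.recordingState exists with a `.recording` case; verify against your SDK version.
import SwiftUI
import StreamVideo

struct RecordingIndicatorView: View {
    @ObservedObject var state: CallState

    var body: some View {
        // Show a badge only while the SDK reports an active recording
        if state.recordingState == .recording {
            Label("REC", systemImage: "record.circle")
                .foregroundStyle(.pink)
        }
    }
}
```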
Congratulations 👏. You deserve applause for following this step-by-step guide to building a fully functioning TikTok-like live video app using SwiftUI and Stream's iOS Video SDK.
What’s Next?
This tutorial guided you through integrating short-form video support into a SwiftUI app to build a functioning TikTok clone. However, we did not cover an endless, scrollable video feed backed by a third-party activity feeds solution; that will come in a future tutorial. You can easily integrate activity feeds into your app with the Stream Activity Feeds API.
Stream’s Video SDK has much more to offer you as a developer. Head to the iOS docs to learn more about integrating screen sharing, initiating calls from deep links, video/audio filters, and more.