I recently played around with recording audio on my Apple Watch. The recording itself was easy enough, and I wanted to send the recorded file from the watch extension to the iOS application and play it there. I started with WCSession's transferFile method, since it works in the background (without the iOS app needing to be open). I banged on it a ton and the transfer worked – but I couldn't figure out a way to actually play the result. The file.fileURL.path on the receiving side had some dynamic, temporary stuff in the URL, which prevented me from successfully creating a valid AVAudioPlayer.
Here is my solution: sending the recorded .wav file to the iPhone with sendMessageData and playing it upon receipt.
//Watch Extension code
override func awake(withContext context: Any?) {
    super.awake(withContext: context)

    if WCSession.isSupported() {
        WCSession.default().delegate = self
        WCSession.default().activate()
    }

    // Build the save URL inside the shared app group container.
    let fileManager = FileManager.default
    let container = fileManager.containerURL(forSecurityApplicationGroupIdentifier: "group.net.ericd.WatchRecord")
    let fileName = "audioFile.wav"
    // saveURL is a class variable
    saveURL = container?.appendingPathComponent(fileName) as NSURL?
}
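One thing worth checking: containerURL(forSecurityApplicationGroupIdentifier:) returns nil unless both the watch extension and the iOS app targets have the App Groups capability enabled with a matching identifier. The entitlement entry (using the group identifier from the code above) looks like this:

```xml
<key>com.apple.security.application-groups</key>
<array>
    <string>group.net.ericd.WatchRecord</string>
</array>
```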
@IBAction func sendAudio() {
    // Safely unwrap instead of force-casting; bail out if the file isn't there yet.
    guard let url = saveURL as URL?, let data = try? Data(contentsOf: url) else {
        print("no recorded audio file to send")
        return
    }
    sendAudioFile(file: data)
}

func sendAudioFile(file: Data) {
    WCSession.default().sendMessageData(file, replyHandler: { (data) -> Void in
        // handle the response from the device
    }) { (error) -> Void in
        print("error: \(error.localizedDescription)")
    }
}
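One caveat: since watchOS 2.2 / iOS 9.3, WCSessionDelegate also requires an activation callback, so the activate() call above expects it to be implemented. A minimal stub, assuming the interface controller itself is the session delegate, would look something like this:

```swift
// Required by WCSessionDelegate on watchOS 2.2+ / iOS 9.3+.
func session(_ session: WCSession,
             activationDidCompleteWith activationState: WCSessionActivationState,
             error: Error?) {
    if let error = error {
        print("WCSession activation failed: \(error.localizedDescription)")
    } else {
        print("WCSession activated with state: \(activationState.rawValue)")
    }
}
```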
When presentAudioRecorderController successfully saves the recorded audio file, its completion handler enables the button that calls the sendAudio function. That code is here.
@IBAction func recordAudio() {
    guard let url = saveURL as URL? else { return }
    let duration = TimeInterval(10)
    let recordingOptions = [WKAudioRecorderControllerOptionsMaximumDurationKey: duration]
    print("record:", url)
    presentAudioRecorderController(withOutputURL: url,
                                   preset: .narrowBandSpeech,
                                   options: recordingOptions,
                                   completion: { saved, error in
        if let err = error {
            print(err.localizedDescription)
        }
        if saved {
            print("saved file.")
            self.playButton.setAlpha(1.0)
            self.sendButton.setAlpha(1.0)
            self.playButton.setEnabled(true)
            self.sendButton.setEnabled(true)
        }
    })
}
Now, in the iOS application, I handle receipt of the data sent with sendMessageData.
func session(_ session: WCSession, didReceiveMessageData messageData: Data, replyHandler: @escaping (Data) -> Void) {
    // Acknowledge receipt so the watch's replyHandler fires instead of timing out.
    replyHandler(Data())

    DispatchQueue.main.async {
        self.someLabel.text = "We got an audio file: \(messageData)" // Shows the byte count
        self.versionLabel.textColor = UIColor.blue
        do {
            // AVAudioPlayer can play straight from the in-memory data.
            self.player = try AVAudioPlayer(data: messageData)
            self.player?.prepareToPlay()
            self.player?.play()
        } catch {
            print(error.localizedDescription)
        }
    }
}
And there the audio file plays upon receipt. Note that this is not a file transfer, which I would actually prefer, since transferFile runs as a background task that doesn't require the iOS application to be in the foreground.
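For completeness, here is roughly what the transferFile route would look like, with the dynamic-URL problem worked around by copying the received file out of its temporary inbox location before the delegate method returns (the system deletes the file at file.fileURL afterwards). This is an untested sketch, not code from the project:

```swift
// Watch side: queue the recording for background transfer.
// WCSession.default().transferFile(saveURL as! URL, metadata: nil)

// iOS side (WCSessionDelegate):
func session(_ session: WCSession, didReceive file: WCSessionFile) {
    let fileManager = FileManager.default
    let documents = fileManager.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let destination = documents.appendingPathComponent("audioFile.wav")
    do {
        // Replace any previous copy, then keep the file somewhere permanent.
        try? fileManager.removeItem(at: destination)
        try fileManager.copyItem(at: file.fileURL, to: destination)
        DispatchQueue.main.async {
            self.player = try? AVAudioPlayer(contentsOf: destination)
            self.player?.play()
        }
    } catch {
        print(error.localizedDescription)
    }
}
```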