Leveraging Multicore Processors (M1 & M2) for Delay-Sensitive Audio Application Development on macOS


Muhammad Rasyad Caesarardhi

Posted on May 31, 2024


Hi! In this article I'll tell you the story of how I built a delay-sensitive application, namely a music tool. In this case, I built DrumCentral, a virtual drum application built natively for macOS.

It was built in under two weeks and I learned a ton from it. Apple technologies such as DispatchQueue, AVFoundation, and SwiftUI were used in this project, along with Figma for the design work.

Why Build a New Drum App?

I love playing drums! It sparks my joy and creativity with beats and music. In my spare time, I might casually listen to a song, notice the beat is really nice, and want to make my own version of it. I enjoy playing beats on my Mac using Virtual Drumming (https://www.virtualdrumming.com), a web version of the Real Drum app found on mobile devices and the iPad. But it was frustrating when I didn't have an internet connection, and I couldn't wait for the webpage to load because I had other tasks to do (I play the drums for stress relief, lol). So I decided to build my own implementation of a virtual drum natively on macOS using Swift and SwiftUI. Why Swift? Well, Apple's technology sparks my interest in building something on top of their devices. So off we went: we built the app, gathered all the assets and sounds, and coded it in less than 10 days.

How to detect a key being pressed?

The first challenge is detecting whether a key on the keyboard is being pressed, and then changing the app's state and re-rendering when it is. I spent a lot of time trying to make this work: I tried polling and registering CGKeyCode values manually with a KeyboardManager, and it worked (unfortunately only for the Space key). So I tried again and found that SwiftUI has a native onKeyPress modifier that automatically detects key presses and handles re-rendering without any meticulous code, and the result is easy to read. Love it!

Text("").focusable()
  .focused($isFocused)
  .onAppear {
    isFocused = true
  }
  .onKeyPress(
    phases: .down,
    action: { keyPress in
      switch keyPress.characters {
      case "c":
        playSnare()
      case "x":
        playHiHat()
      default:
        print("default")
      }
      return .handled
    })

The trick is to make an empty Text element that is focusable (if the element is not focused, key presses won't update the UI), then add the .onKeyPress modifier. Inside .onKeyPress we can detect which key was pressed through the action closure's parameter, for example keyPress.characters.

How to play a sound using AVFoundation

The next challenge is how to play a sound on macOS. I gathered information, researched Apple's developer documentation, and found AVFoundation. Playing a sound with AVFoundation is simple enough, as the code below shows.

// `audioPlayer` is an AVAudioPlayer property on the owning type, so the
// player isn't deallocated before the sound finishes playing.
if let soundURL = Bundle.main.url(forResource: "hihat", withExtension: "wav") {
  do {
    audioPlayer = try AVAudioPlayer(contentsOf: soundURL)
    audioPlayer.prepareToPlay()
    audioPlayer.play()
  } catch {
    print("Error: Could not initialize AVAudioPlayer - \(error.localizedDescription)")
  }
} else {
  print("Error: Sound file not found")
}

To use the sound assets we need to copy the files into Xcode (or just drag and drop them) and use the settings below. This copies the sounds into the project and makes them visible to others if we want to collaborate via Git.

(Screenshot: Xcode settings for importing a sound file)
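Before wiring anything up, a quick sanity check helps (this snippet is my own addition, not from the original project): resolve the bundled file at runtime and make sure it really ended up in the app bundle.

import Foundation

// If this prints the fallback message, the .wav wasn't copied into the app
// target (check "Copy items if needed" and the file's target membership).
let hihatURL = Bundle.main.url(forResource: "hihat", withExtension: "wav")
print(hihatURL?.path ?? "hihat.wav is not in the bundle")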

What happened?! The sound is not responsive to the user's key presses

Okay, now we find that the sound is not very responsive to my key presses: when I press "B" three times, only one kick drum sound comes out. So the new challenge is making the app output a sound the instant a key is pressed. First I tried modifying the AVFoundation class, but that led nowhere, so I changed my approach: rather than modifying AVFoundation, why not wrap each AVFoundation call in a new thread? Eureka! That's the answer. Now we need to learn how to do multithreading that utilizes all the cores and threads of the Apple M1 and M2 processors. After some research, I found we can do this with a feature called DispatchQueue. Below is what multithreading with DispatchQueue looks like.

DispatchQueue.global().async {
  // hand the work to a global concurrent queue so the main thread stays free
  playSound()
}

When this code runs, Grand Central Dispatch schedules the playSound call on a thread from its global concurrent pool and tears the work down appropriately when it finishes. Since no data is passed between the threads, this style is sometimes called non-data-sharing multithreading.
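To make the pattern concrete, here is a minimal sketch (the fireHit helper and the quality-of-service choice are my own, not taken from DrumCentral) of firing several hits back to back, each as its own work item:

import Dispatch

// Hypothetical helper: every hit becomes its own work item on a global
// concurrent queue, so three rapid presses produce three overlapping sounds
// instead of waiting on one another.
func fireHit(_ play: @escaping () -> Void) {
  DispatchQueue.global(qos: .userInteractive).async {
    play()  // runs on a pool thread; no state is shared between work items
  }
}

// Usage: three kick hits in quick succession (playKick as in the earlier snippet).
fireHit(playKick)
fireHit(playKick)
fireHit(playKick)

The .userInteractive quality-of-service class just hints to GCD that this work is driven directly by user input; the default global queue, as used above, works too.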

Bonus: Readable and Clean Code using AudioManager and Factory

Last one: the app is now fully functional, but at a cost. The cost is that the code looks terrible right now. How do we make it better? ———Give your best answer in the comments———

Hint: I found that we could extract our AVFoundation audio-playing code into a brand new AudioManager. Each AudioManager instance owns a unique sound, so the sound file becomes an argument when instantiating AudioManager. Since the drum set has a lot of kit pieces, like HiHat (open), HiHat (closed), Ride, Cymbals, Snare, High Tom, Low Tom, Floor Tom, and Kick, we can instantiate all of them from a factory enum.

// AudioManager.swift

import AVFoundation

final class AudioManager {
  private var audioFileName: String
  private var audioFileType: String
  private var audioPlayer: AVAudioPlayer

  init?(audioFileName: String, fileType: String) {
    self.audioFileName = audioFileName
    self.audioFileType = fileType
    self.audioPlayer = AVAudioPlayer()
  }

  /// Dispatches playback to a global concurrent queue so key handling on the
  /// main thread is never blocked.
  func playSound() {
    DispatchQueue.global().async {
      self._playSound(soundFileName: self.audioFileName, fileType: self.audioFileType)
    }
  }

  /// Loads the bundled sound file and plays it.
  private func _playSound(soundFileName: String, fileType: String) {
    if let soundURL = Bundle.main.url(forResource: soundFileName, withExtension: fileType) {
      do {
        audioPlayer = try AVAudioPlayer(contentsOf: soundURL)
        audioPlayer.prepareToPlay()
        audioPlayer.play()
      } catch {
        print("Error: Could not initialize AVAudioPlayer - \(error.localizedDescription)")
      }
    } else {
      print("Error: Sound file not found")
    }
  }
}


Below is the code for SoundFactory.

// SoundFactory.swift

enum SoundFactory {
  static let CRASH = AudioManager(audioFileName: "crash", fileType: "wav")
  static let FLOORTOM = AudioManager(audioFileName: "floortom", fileType: "wav")
  static let HIHAT = AudioManager(audioFileName: "hihat", fileType: "wav")
  static let KICK = AudioManager(audioFileName: "kick", fileType: "wav")
  static let HIHATOPEN = AudioManager(audioFileName: "hihatopen", fileType: "wav")
  static let RIDE = AudioManager(audioFileName: "ride", fileType: "wav")
  static let SNARE = AudioManager(audioFileName: "Snare_12_649", fileType: "wav")
  static let TOMHIGH = AudioManager(audioFileName: "tomhigh", fileType: "wav")
  static let TOMLOW = AudioManager(audioFileName: "tomlow", fileType: "wav")
  static let OPENING = AudioManager(audioFileName: "opening", fileType: "wav")
}

Playing a drum sound is then as simple as the code below.

SoundFactory.SNARE?.playSound()
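Putting it together with the key handling from earlier, a hedged sketch (the key-to-kit mapping here is illustrative, not the full DrumCentral layout) could look like this:

// Illustrative key-to-kit mapping inside the .onKeyPress handler shown earlier.
.onKeyPress(phases: .down) { keyPress in
  switch keyPress.characters {
  case "c": SoundFactory.SNARE?.playSound()
  case "x": SoundFactory.HIHAT?.playSound()
  case "b": SoundFactory.KICK?.playSound()
  default: return .ignored
  }
  return .handled
}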

And that's all!

Muhammad Rasyad Caesarardhi

Apple Developer Academy - BINUS Cohort 7

Notes:

*Big thanks to my partner Alifiyah Ariandri (a.k.a. al) for making the drum assets. The name "al" is the inspiration for BeatCentral.
